Apr 22 14:12:23.386492 ip-10-0-133-31 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 14:12:23.386504 ip-10-0-133-31 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 14:12:23.386514 ip-10-0-133-31 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 14:12:23.386825 ip-10-0-133-31 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 14:12:33.385788 ip-10-0-133-31 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 14:12:33.385809 ip-10-0-133-31 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 9f93c6b9257e46ac8c5d98069a4fe5e2 --
Apr 22 14:14:56.169940 ip-10-0-133-31 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:14:56.679150 ip-10-0-133-31 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:14:56.679150 ip-10-0-133-31 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:14:56.679150 ip-10-0-133-31 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:14:56.679150 ip-10-0-133-31 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:14:56.679150 ip-10-0-133-31 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:14:56.679818 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.679198 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:14:56.681487 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681472 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:14:56.681487 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681487 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681491 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681494 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681497 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681500 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681503 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681507 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681511 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681514 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681517 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681520 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681523 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681525 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681528 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681536 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681539 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681541 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681544 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681547 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:14:56.681555 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681550 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681552 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681555 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681558 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681560 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681564 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681567 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681570 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681572 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681575 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681577 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681580 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681583 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681586 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681588 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681591 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681594 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681597 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681599 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681602 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:14:56.682022 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681604 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681607 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681610 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681612 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681615 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681618 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681620 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681623 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681625 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681628 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681630 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681633 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681635 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681638 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681642 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681658 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681661 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681665 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681668 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681671 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:14:56.682546 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681673 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681676 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681679 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681681 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681684 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681687 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681690 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681693 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681695 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681698 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681701 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681703 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681706 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681709 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681712 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681714 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681717 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681721 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681725 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:14:56.683052 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681729 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681732 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681734 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681737 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681740 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681743 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.681745 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682137 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682143 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682146 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682149 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682152 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682155 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682157 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682160 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682163 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682165 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682168 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682171 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682174 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:14:56.683504 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682177 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682179 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682182 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682184 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682187 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682189 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682192 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682195 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682197 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682200 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682202 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682205 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682207 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682210 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682212 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682215 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682218 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682220 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682223 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682226 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:14:56.683992 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682229 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682232 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682235 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682238 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682240 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682243 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682245 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682249 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682253 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682256 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682259 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682263 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682267 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682270 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682273 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682275 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682278 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682281 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682284 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:14:56.684485 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682287 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682289 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682292 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682294 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682297 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682299 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682302 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682304 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682307 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682310 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682312 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682314 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682317 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682320 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682323 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682325 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682327 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682330 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682332 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682335 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:14:56.684968 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682337 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682340 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682342 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682344 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682347 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682349 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682352 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682356 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682358 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682365 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682368 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682371 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682373 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.682376 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683379 2562 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683388 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683394 2562 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683399 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683403 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683406 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683411 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:14:56.685499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683415 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683418 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683422 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683426 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683429 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683432 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683435 2562 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683438 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683441 2562 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683444 2562 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683447 2562 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683450 2562 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683455 2562 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683458 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683461 2562 flags.go:64] FLAG: --config-dir=""
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683464 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683468 2562 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683472 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683475 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683479 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683482 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683486 2562 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683488 2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683491 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683495 2562 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:14:56.686026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683497 2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683502 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683505 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683508 2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683511 2562 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683514 2562 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683517 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683521 2562 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683525 2562 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683528 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683531 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683534 2562 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683538 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683541 2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683544 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683547 2562 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683549 2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683552 2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683555 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683558 2562 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683561 2562 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683564 2562 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683566 2562 flags.go:64] FLAG: --feature-gates="" Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683570 2562 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683573 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 14:14:56.686633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683576 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683580 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683584 2562 flags.go:64] FLAG: --healthz-port="10248" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683587 2562 flags.go:64] FLAG: --help="false" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683590 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-133-31.ec2.internal" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683593 2562 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683596 2562 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683599 2562 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683603 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683607 2562 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683610 2562 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683612 2562 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683615 2562 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683618 2562 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683620 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683623 2562 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683626 2562 flags.go:64] FLAG: --kube-reserved="" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683629 2562 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683633 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683636 2562 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683638 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 14:14:56.687248 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:14:56.683641 2562 flags.go:64] FLAG: --lock-file="" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683644 2562 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683658 2562 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 14:14:56.687248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683662 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683667 2562 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683670 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683673 2562 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683676 2562 flags.go:64] FLAG: --logging-format="text" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683679 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683682 2562 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683685 2562 flags.go:64] FLAG: --manifest-url="" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683688 2562 flags.go:64] FLAG: --manifest-url-header="" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683692 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683696 2562 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683700 2562 flags.go:64] FLAG: --max-pods="110" Apr 22 14:14:56.687838 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683703 2562 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683706 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683709 2562 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683712 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683715 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683718 2562 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683721 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683728 2562 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683731 2562 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683735 2562 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683738 2562 flags.go:64] FLAG: --pod-cidr="" Apr 22 14:14:56.687838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683741 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683746 2562 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683750 2562 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683753 2562 flags.go:64] FLAG: --pods-per-core="0" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683756 2562 flags.go:64] FLAG: --port="10250" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683759 2562 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683762 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05c1659189b17a776" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683765 2562 flags.go:64] FLAG: --qos-reserved="" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683768 2562 flags.go:64] FLAG: --read-only-port="10255" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683771 2562 flags.go:64] FLAG: --register-node="true" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683774 2562 flags.go:64] FLAG: --register-schedulable="true" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683776 2562 flags.go:64] FLAG: --register-with-taints="" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683780 2562 flags.go:64] FLAG: --registry-burst="10" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683783 2562 flags.go:64] FLAG: --registry-qps="5" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683786 2562 flags.go:64] FLAG: --reserved-cpus="" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683789 2562 flags.go:64] FLAG: --reserved-memory="" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683793 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683795 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 
14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683798 2562 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683801 2562 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683810 2562 flags.go:64] FLAG: --runonce="false" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683813 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683816 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683820 2562 flags.go:64] FLAG: --seccomp-default="false" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683823 2562 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683826 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 14:14:56.688391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683829 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683832 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683836 2562 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683838 2562 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683841 2562 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683845 2562 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:14:56.683848 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683851 2562 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683854 2562 flags.go:64] FLAG: --system-cgroups="" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683857 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683862 2562 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683865 2562 flags.go:64] FLAG: --tls-cert-file="" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683868 2562 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683873 2562 flags.go:64] FLAG: --tls-min-version="" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683876 2562 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683879 2562 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683881 2562 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683884 2562 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683887 2562 flags.go:64] FLAG: --v="2" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683891 2562 flags.go:64] FLAG: --version="false" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683895 2562 flags.go:64] FLAG: --vmodule="" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:14:56.683900 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.683903 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.683997 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684004 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:14:56.689044 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684009 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684012 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684016 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684019 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684021 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684024 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684027 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684029 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684032 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 
14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684035 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684037 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684040 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684043 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684045 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684048 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684050 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684053 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684056 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684058 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684061 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:14:56.689633 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684064 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684066 2562 feature_gate.go:328] unrecognized 
feature gate: OpenShiftPodSecurityAdmission Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684069 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684072 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684074 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684077 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684079 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684082 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684084 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684087 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684089 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684093 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684097 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684100 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684102 
2562 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684105 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684108 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684111 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684113 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684116 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:14:56.690422 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684119 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684122 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684124 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684127 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684130 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684132 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684135 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:14:56.691318 ip-10-0-133-31 
kubenswrapper[2562]: W0422 14:14:56.684137 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684140 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684143 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684145 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684148 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684150 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684153 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684155 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684158 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684160 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684163 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684165 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684168 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 
22 14:14:56.691318 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684170 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684174 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684178 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684183 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684188 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684191 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684194 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684197 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684201 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684204 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684206 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684209 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684211 2562 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684214 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684217 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684219 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684222 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684225 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684229 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:14:56.692181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684231 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684234 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684236 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684239 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.684241 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.684707 2562 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.692145 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.692167 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692265 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692274 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692280 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692285 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692290 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692294 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692299 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:14:56.693043 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692304 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692309 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692314 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692318 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692322 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692327 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692333 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692337 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692342 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692346 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692351 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692355 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692360 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692364 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692368 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692372 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692377 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692381 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692386 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692390 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:14:56.693608 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692394 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692399 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692404 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692408 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692414 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692419 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692423 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692427 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692431 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692435 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692440 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692444 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692448 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692452 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692456 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692460 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692464 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692469 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692473 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692477 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:14:56.694125 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692482 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692486 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692490 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692495 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692499 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692503 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692507 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692511 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692515 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692519 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692523 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692528 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692532 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692536 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692542 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692548 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692554 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692560 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692565 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692569 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:14:56.694624 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692573 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692577 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692581 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692588 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692594 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692599 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692604 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692608 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692612 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692617 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692621 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692625 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692629 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692633 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692638 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692642 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692665 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692671 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:14:56.695257 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692674 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.692683 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692871 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692881 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692886 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692891 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692895 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692899 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692903 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692907 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692911 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692918 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692927 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692932 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692936 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:14:56.695749 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692940 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692947 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692953 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692957 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692962 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692966 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692971 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692975 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692979 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692983 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692988 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692992 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.692996 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693001 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693005 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693009 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693013 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693017 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693022 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693026 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:14:56.696408 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693030 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693034 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693039 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693042 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693047 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693050 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693054 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693058 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693063 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693066 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693071 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693076 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693081 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693085 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693089 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693093 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693097 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693101 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693106 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:14:56.697262 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693110 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693115 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693122 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693126 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693131 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693136 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693140 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693144 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693149 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693153 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693157 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693161 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693165 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693169 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693173 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693178 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693182 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693186 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693190 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:14:56.697811 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693195 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693199 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693203 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693207 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693211 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693216 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693221 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693225 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693229 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693233 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693237 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693241 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693245 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693249 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:56.693253 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:14:56.698375 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.693262 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:14:56.698802 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.694768 2562 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 14:14:56.698802 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.697817 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 14:14:56.698965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.698951 2562 server.go:1019] "Starting client certificate rotation"
Apr 22 14:14:56.699102 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.699047 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:14:56.699102 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.699085 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:14:56.729247 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.729214 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:14:56.732404 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.732153 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:14:56.743793 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.743773 2562 log.go:25] "Validated CRI v1 runtime API"
Apr 22 14:14:56.749675 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.749659 2562 log.go:25] "Validated CRI v1 image API"
Apr 22 14:14:56.751578 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.751555 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 14:14:56.755833 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.755812 2562 fs.go:135] Filesystem UUIDs: map[15490335-6781-4cf5-8f95-2013090e8d27:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 d9fce29c-b0f5-4df7-b5ce-c71daa614d9b:/dev/nvme0n1p3]
Apr 22 14:14:56.755895 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.755833 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 14:14:56.758068 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.758050 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:14:56.761593 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.761481 2562 manager.go:217] Machine: {Timestamp:2026-04-22 14:14:56.759593858 +0000 UTC m=+0.458954532 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103211 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27aea0b15ce8f03ceeac4cb128bf4a SystemUUID:ec27aea0-b15c-e8f0-3cee-ac4cb128bf4a BootID:9f93c6b9-257e-46ac-8c5d-98069a4fe5e2 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:11:96:40:92:21 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:11:96:40:92:21 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:66:d1:45:f9:54:d4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 14:14:56.761593 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.761587 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 14:14:56.761710 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.761677 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 14:14:56.762764 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.762738 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 14:14:56.762899 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.762766 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-31.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 14:14:56.762944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.762908 2562 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 14:14:56.762944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.762917 2562 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 14:14:56.762944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.762930 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 14:14:56.763712 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.763702 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 14:14:56.764413 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.764403 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 14:14:56.764517 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.764508 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 14:14:56.767171 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.767162 2562 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 14:14:56.767206 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.767175 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 14:14:56.767206 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.767188 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 14:14:56.767206 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.767198 2562 kubelet.go:397] "Adding apiserver pod source"
Apr 22 14:14:56.767206 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.767205 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 14:14:56.768367 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.768352 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 14:14:56.768367 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.768370 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 14:14:56.771487 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.771468 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 14:14:56.772777 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.772764 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 14:14:56.774681 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774666 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 14:14:56.774681 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774683 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774690 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774696 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774701 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774707 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774713 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774718 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774725 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774732 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774741 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 14:14:56.774807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.774750 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 14:14:56.775612 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.775602 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 14:14:56.775612 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.775612 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 14:14:56.780411 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.780386 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dltht"
Apr 22 14:14:56.783082 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.783054 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-31.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 14:14:56.783254 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.783234 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-31.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 14:14:56.783311 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.783235 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 14:14:56.783824 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.783810 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 14:14:56.783872 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.783855 2562 server.go:1295] "Started kubelet"
Apr 22 14:14:56.783945 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.783923 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 14:14:56.784055 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.784011 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 14:14:56.784094 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.784082 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 14:14:56.784792 ip-10-0-133-31 systemd[1]: Started Kubernetes Kubelet.
Apr 22 14:14:56.785111 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.785095 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 14:14:56.786428 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.786413 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dltht"
Apr 22 14:14:56.786517 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.786475 2562 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 14:14:56.789758 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.788622 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-31.ec2.internal.18a8b36465cef6e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-31.ec2.internal,UID:ip-10-0-133-31.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-31.ec2.internal,},FirstTimestamp:2026-04-22 14:14:56.783824611 +0000 UTC m=+0.483185286,LastTimestamp:2026-04-22 14:14:56.783824611 +0000 UTC m=+0.483185286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-31.ec2.internal,}"
Apr 22 14:14:56.791859 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.791841 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 14:14:56.791954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.791859 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 14:14:56.792699 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.792680 2562 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 14:14:56.792699 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.792703 2562 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 14:14:56.792822 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.792685 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 14:14:56.792822 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.792724 2562 factory.go:55] Registering systemd factory
Apr 22 14:14:56.792822 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.792738 2562 factory.go:223] Registration of the systemd container factory successfully
Apr 22 14:14:56.792822 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.792751 2562 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 14:14:56.792822 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.792761 2562 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 14:14:56.792822 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.792710 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 14:14:56.793107 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.792820 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:56.794587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.794569 2562 factory.go:153] Registering CRI-O factory
Apr 22 14:14:56.794587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.794587 2562 factory.go:223] Registration of the crio container factory successfully
Apr 22 14:14:56.794750 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.794614 2562 factory.go:103] Registering Raw factory
Apr 22 14:14:56.794750 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.794627 2562 manager.go:1196] Started watching for new ooms in manager
Apr 22 14:14:56.795240 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.795224 2562 manager.go:319] Starting recovery of all containers
Apr 22 14:14:56.795605 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.795576 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 14:14:56.802491 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.802455 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:14:56.806906 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.806763 2562 manager.go:324] Recovery completed
Apr 22 14:14:56.808796 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.808768 2562 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-31.ec2.internal\" not found" node="ip-10-0-133-31.ec2.internal"
Apr 22 14:14:56.809628 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.809611 2562 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 22 14:14:56.812947 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.812935 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:14:56.815450 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.815435 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:14:56.815519 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.815462 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:14:56.815519 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.815473 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:14:56.815909 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.815894 2562 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 14:14:56.815945 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.815910 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 14:14:56.815945 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.815926 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 14:14:56.818951 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.818940 2562 policy_none.go:49] "None policy: Start"
Apr 22 14:14:56.818990 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.818956 2562 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 14:14:56.818990 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.818966 2562 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.858201 2562 manager.go:341] "Starting Device Plugin manager"
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.858236 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.858246 2562 server.go:85] "Starting device plugin registration server"
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.858471 2562 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.858483 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.858594 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.858717 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.858726 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.859180 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 14:14:56.871039 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.859216 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:56.919002 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.918967 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 14:14:56.920193 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.920177 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 14:14:56.920255 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.920207 2562 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 14:14:56.920255 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.920231 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 14:14:56.920255 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.920241 2562 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 14:14:56.920353 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.920274 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 14:14:56.922686 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.922669 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:14:56.958846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.958775 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:14:56.959704 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.959685 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:14:56.959814 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.959719 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:14:56.959814 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.959733 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:14:56.959814 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.959756 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-31.ec2.internal"
Apr 22 14:14:56.967872 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:56.967854 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-31.ec2.internal"
Apr 22 14:14:56.967932 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.967878 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-31.ec2.internal\": node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:56.981306 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:56.981287 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.020816 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.020783 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal"]
Apr 22 14:14:57.020913 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.020882 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:14:57.021839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.021825 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:14:57.021915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.021854 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:14:57.021915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.021869 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:14:57.024295 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.024281 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:14:57.025018 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.025001 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:14:57.025097 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.025028 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:14:57.025097 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.025038 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:14:57.025250 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.025237 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.025283 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.025264 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:14:57.025915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.025899 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:14:57.025979 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.025923 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:14:57.025979 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.025937 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:14:57.027226 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.027209 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.027308 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.027232 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:14:57.027933 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.027916 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:14:57.028001 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.027949 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:14:57.028001 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.027962 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:14:57.056277 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.056250 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-31.ec2.internal\" not found" node="ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.060502 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.060484 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-31.ec2.internal\" not found" node="ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.081378 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.081357 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.094608 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.094575 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/86015c3ccc333a84fef469be94c55ee1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal\" (UID: \"86015c3ccc333a84fef469be94c55ee1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.094608 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.094613 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86015c3ccc333a84fef469be94c55ee1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal\" (UID: \"86015c3ccc333a84fef469be94c55ee1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.094729 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.094634 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8afe0898337b196494e7d612c6fa17e1-config\") pod \"kube-apiserver-proxy-ip-10-0-133-31.ec2.internal\" (UID: \"8afe0898337b196494e7d612c6fa17e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.181667 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.181622 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.194972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.194952 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/86015c3ccc333a84fef469be94c55ee1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal\" (UID: \"86015c3ccc333a84fef469be94c55ee1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.195023 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.194979 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86015c3ccc333a84fef469be94c55ee1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal\" (UID: \"86015c3ccc333a84fef469be94c55ee1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.195023 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.194995 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8afe0898337b196494e7d612c6fa17e1-config\") pod \"kube-apiserver-proxy-ip-10-0-133-31.ec2.internal\" (UID: \"8afe0898337b196494e7d612c6fa17e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.195093 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.195023 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8afe0898337b196494e7d612c6fa17e1-config\") pod \"kube-apiserver-proxy-ip-10-0-133-31.ec2.internal\" (UID: \"8afe0898337b196494e7d612c6fa17e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.195136 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.195119 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/86015c3ccc333a84fef469be94c55ee1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal\" (UID: \"86015c3ccc333a84fef469be94c55ee1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.195182 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.195159 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86015c3ccc333a84fef469be94c55ee1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal\" (UID: \"86015c3ccc333a84fef469be94c55ee1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.282423 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.282386 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.357941 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.357904 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.363625 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.363607 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:57.383149 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.383118 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.483749 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.483702 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.584244 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.584175 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.684725 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.684691 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.699071 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.699046 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 14:14:57.699209 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.699191 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:14:57.699272 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.699223 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:14:57.785149 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.785112 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.788273 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.788244 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:09:56 +0000 UTC" deadline="2027-11-27 02:57:38.541953349 +0000 UTC"
Apr 22 14:14:57.788273 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.788270 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14004h42m40.753686335s"
Apr 22 14:14:57.792738 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.792710 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 14:14:57.803995 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.803975 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:14:57.822931 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.822911 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kk7ck"
Apr 22 14:14:57.831446 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.831426 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kk7ck"
Apr 22 14:14:57.863593 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.863571 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:14:57.875000 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:57.874965 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86015c3ccc333a84fef469be94c55ee1.slice/crio-6570cf9b8848205aae8e88d0fd22fdecf29616a5a890f2f88c1a80796bfa9e63 WatchSource:0}: Error finding container 6570cf9b8848205aae8e88d0fd22fdecf29616a5a890f2f88c1a80796bfa9e63: Status 404 returned error can't find the container with id 6570cf9b8848205aae8e88d0fd22fdecf29616a5a890f2f88c1a80796bfa9e63
Apr 22 14:14:57.875225 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:57.875207 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8afe0898337b196494e7d612c6fa17e1.slice/crio-d9f9dd4be7626d26101c924be173a48635f06df5c3d10070eabc5f87df4d152c WatchSource:0}: Error finding container d9f9dd4be7626d26101c924be173a48635f06df5c3d10070eabc5f87df4d152c: Status 404 returned error can't find the container with id d9f9dd4be7626d26101c924be173a48635f06df5c3d10070eabc5f87df4d152c
Apr 22 14:14:57.879500 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.879485 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:14:57.885251 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.885231 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:57.923032 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.922984 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal" event={"ID":"8afe0898337b196494e7d612c6fa17e1","Type":"ContainerStarted","Data":"d9f9dd4be7626d26101c924be173a48635f06df5c3d10070eabc5f87df4d152c"}
Apr 22 14:14:57.925216 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:57.925196 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal" event={"ID":"86015c3ccc333a84fef469be94c55ee1","Type":"ContainerStarted","Data":"6570cf9b8848205aae8e88d0fd22fdecf29616a5a890f2f88c1a80796bfa9e63"}
Apr 22 14:14:57.986016 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:57.985982 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:58.086526 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:58.086493 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:58.187061 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:58.186985 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-31.ec2.internal\" not found"
Apr 22 14:14:58.278621 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.277029 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:14:58.292732 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.292707 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal"
Apr 22 14:14:58.305997 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.305968 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 14:14:58.307024 ip-10-0-133-31 kubenswrapper[2562]: I0422
14:14:58.307009 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal" Apr 22 14:14:58.313559 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.313539 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:14:58.769362 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.769330 2562 apiserver.go:52] "Watching apiserver" Apr 22 14:14:58.779751 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.779727 2562 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 14:14:58.780146 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.780125 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-fwhsq","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5","openshift-cluster-node-tuning-operator/tuned-ghlf9","openshift-dns/node-resolver-jf64f","openshift-image-registry/node-ca-9t7jt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal","openshift-multus/multus-8tc5r","openshift-multus/multus-additional-cni-plugins-sl5cl","kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal","openshift-multus/network-metrics-daemon-7pz2p","openshift-network-diagnostics/network-check-target-8kfb7","openshift-network-operator/iptables-alerter-z56xg","openshift-ovn-kubernetes/ovnkube-node-k777w"] Apr 22 14:14:58.783014 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.782948 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jf64f" Apr 22 14:14:58.785271 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.785208 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.785711 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.785692 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 14:14:58.785806 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.785699 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 14:14:58.785806 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.785757 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-n4pfj\"" Apr 22 14:14:58.787436 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.787408 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fwhsq" Apr 22 14:14:58.787597 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.787581 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 14:14:58.787867 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.787832 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 14:14:58.788168 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.788136 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 14:14:58.788168 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.788141 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 14:14:58.788307 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.788207 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 14:14:58.788307 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.788149 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sncpp\"" Apr 22 14:14:58.789180 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.789157 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:14:58.789274 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.789195 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 14:14:58.789758 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.789738 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.789758 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.789750 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 14:14:58.790260 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.790106 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-4n44t\"" Apr 22 14:14:58.790260 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.790251 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 14:14:58.792067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.792040 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9t7jt" Apr 22 14:14:58.792217 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.792196 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 14:14:58.792311 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.792264 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 14:14:58.792588 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.792566 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 14:14:58.792682 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.792595 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-hqgwb\"" Apr 22 14:14:58.793068 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.792881 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 14:14:58.794329 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.794311 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.795002 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.794961 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 14:14:58.795002 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.794967 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 14:14:58.795144 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.795027 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ppfc2\"" Apr 22 14:14:58.795224 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.795198 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 14:14:58.797084 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.797066 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:58.798166 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.798148 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:14:58.798412 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.798399 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 14:14:58.798514 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.798497 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2b9zp\"" Apr 22 14:14:58.800224 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.800200 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 14:14:58.800358 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.800344 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 14:14:58.801083 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.800455 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dzcwp\"" Apr 22 14:14:58.801853 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.801760 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:14:58.801950 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:58.801846 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:14:58.803761 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.803740 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-socket-dir-parent\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.803846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.803775 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-slash\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.803846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.803799 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-cni-bin\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.803846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.803820 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-var-lib-kubelet\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.803846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.803842 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/daf625bc-3312-4340-b15d-afef34e3a313-tmp\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804052 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.803866 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw7dl\" (UniqueName: \"kubernetes.io/projected/daf625bc-3312-4340-b15d-afef34e3a313-kube-api-access-tw7dl\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804052 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.803922 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a50e9092-d980-437f-925d-016de60cc559-hosts-file\") pod \"node-resolver-jf64f\" (UID: \"a50e9092-d980-437f-925d-016de60cc559\") " pod="openshift-dns/node-resolver-jf64f" Apr 22 14:14:58.804052 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.803965 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524b05a6-377c-460c-a38e-359a1d04f304-ovnkube-config\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.804052 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804003 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-daemon-config\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.804052 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804033 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-etc-kubernetes\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.804300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804059 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-sysctl-d\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804102 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-sysctl-conf\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804144 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-run-ovn\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.804300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804173 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-system-cni-dir\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.804300 ip-10-0-133-31 kubenswrapper[2562]: 
I0422 14:14:58.804200 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-run-multus-certs\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.804300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804232 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-systemd\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804239 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:14:58.804300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804262 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-systemd-units\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.804300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804287 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-run-openvswitch\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804311 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cc6398d2-2767-495e-b2e8-7f3f713e5a31-agent-certs\") pod \"konnectivity-agent-fwhsq\" (UID: \"cc6398d2-2767-495e-b2e8-7f3f713e5a31\") " pod="kube-system/konnectivity-agent-fwhsq" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804335 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ks86\" (UniqueName: \"kubernetes.io/projected/a50e9092-d980-437f-925d-016de60cc559-kube-api-access-4ks86\") pod \"node-resolver-jf64f\" (UID: \"a50e9092-d980-437f-925d-016de60cc559\") " pod="openshift-dns/node-resolver-jf64f" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:58.804343 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804358 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-run-netns\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804414 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-kubernetes\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804433 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-run\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804452 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/daf625bc-3312-4340-b15d-afef34e3a313-etc-tuned\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804471 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-sysconfig\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804491 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-sys\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804511 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-cni-binary-copy\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804526 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-hostroot\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804545 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804570 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/524b05a6-377c-460c-a38e-359a1d04f304-env-overrides\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804594 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbnj\" (UniqueName: \"kubernetes.io/projected/524b05a6-377c-460c-a38e-359a1d04f304-kube-api-access-hzbnj\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804618 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-cnibin\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.804724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804640 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pbt4\" (UniqueName: \"kubernetes.io/projected/1ec369a9-6fa7-4522-ab1c-257f1ae32b8d-kube-api-access-6pbt4\") pod \"node-ca-9t7jt\" (UID: \"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d\") " pod="openshift-image-registry/node-ca-9t7jt" Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804685 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-cni-netd\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.805442 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804698 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-run-netns\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804713 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-var-lib-cni-bin\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804728 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-var-lib-cni-multus\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804773 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-conf-dir\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804795 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-run-ovn-kubernetes\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804809 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524b05a6-377c-460c-a38e-359a1d04f304-ovn-node-metrics-cert\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804833 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcbts\" (UniqueName: \"kubernetes.io/projected/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-kube-api-access-lcbts\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804850 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524b05a6-377c-460c-a38e-359a1d04f304-ovnkube-script-lib\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804866 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-var-lib-kubelet\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804881 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-modprobe-d\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804903 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ec369a9-6fa7-4522-ab1c-257f1ae32b8d-serviceca\") pod \"node-ca-9t7jt\" (UID: \"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d\") " pod="openshift-image-registry/node-ca-9t7jt"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804924 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-etc-openvswitch\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804938 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-run-k8s-cni-cncf-io\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804953 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-kubelet\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.804968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-log-socket\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.805442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805009 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-cni-dir\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.806025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805031 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-os-release\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.806025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805053 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ec369a9-6fa7-4522-ab1c-257f1ae32b8d-host\") pod \"node-ca-9t7jt\" (UID: \"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d\") " pod="openshift-image-registry/node-ca-9t7jt"
Apr 22 14:14:58.806025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805068 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a50e9092-d980-437f-925d-016de60cc559-tmp-dir\") pod \"node-resolver-jf64f\" (UID: \"a50e9092-d980-437f-925d-016de60cc559\") " pod="openshift-dns/node-resolver-jf64f"
Apr 22 14:14:58.806025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805083 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-run-systemd\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.806025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805097 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-var-lib-openvswitch\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.806025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805121 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-node-log\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.806025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805142 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-lib-modules\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.806025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805162 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-host\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.806025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.805184 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cc6398d2-2767-495e-b2e8-7f3f713e5a31-konnectivity-ca\") pod \"konnectivity-agent-fwhsq\" (UID: \"cc6398d2-2767-495e-b2e8-7f3f713e5a31\") " pod="kube-system/konnectivity-agent-fwhsq"
Apr 22 14:14:58.806479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.806464 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z56xg"
Apr 22 14:14:58.808910 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.808889 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5"
Apr 22 14:14:58.816200 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.816010 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:14:58.816200 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.816037 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 14:14:58.816200 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.816055 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 14:14:58.816200 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.816056 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 14:14:58.816200 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.816169 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-x9sz9\""
Apr 22 14:14:58.816495 ip-10-0-133-31
kubenswrapper[2562]: I0422 14:14:58.816204 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 14:14:58.816495 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.816281 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 14:14:58.816495 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.816330 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-97skw\""
Apr 22 14:14:58.832159 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.832035 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:09:57 +0000 UTC" deadline="2027-12-28 14:22:46.469063062 +0000 UTC"
Apr 22 14:14:58.832159 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.832058 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14760h7m47.637007901s"
Apr 22 14:14:58.894180 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.894149 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 14:14:58.905977 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.905939 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-kubelet\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.905985 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-log-socket\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906005 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-cni-dir\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906029 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-os-release\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906050 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ec369a9-6fa7-4522-ab1c-257f1ae32b8d-host\") pod \"node-ca-9t7jt\" (UID: \"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d\") " pod="openshift-image-registry/node-ca-9t7jt"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906056 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-kubelet\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906070 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a50e9092-d980-437f-925d-016de60cc559-tmp-dir\") pod \"node-resolver-jf64f\" (UID: \"a50e9092-d980-437f-925d-016de60cc559\") " pod="openshift-dns/node-resolver-jf64f"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906111 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-run-systemd\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906124 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-log-socket\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906140 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-var-lib-openvswitch\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906128 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ec369a9-6fa7-4522-ab1c-257f1ae32b8d-host\") pod \"node-ca-9t7jt\" (UID: \"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d\") " pod="openshift-image-registry/node-ca-9t7jt"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906184 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-cni-dir\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906194 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-os-release\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906216 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-node-log\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906230 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-var-lib-openvswitch\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906227 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-run-systemd\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906259 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-lib-modules\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906275 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-node-log\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906288 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-host\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906326 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cc6398d2-2767-495e-b2e8-7f3f713e5a31-konnectivity-ca\") pod \"konnectivity-agent-fwhsq\" (UID: \"cc6398d2-2767-495e-b2e8-7f3f713e5a31\") " pod="kube-system/konnectivity-agent-fwhsq"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906353 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-socket-dir-parent\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906370 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-host\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906383 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0212ebfc-c697-40e1-8939-863a200bf32a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906413 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0212ebfc-c697-40e1-8939-863a200bf32a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906412 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-lib-modules\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906444 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-socket-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906487 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-socket-dir-parent\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.906623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906501 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-slash\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906504 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a50e9092-d980-437f-925d-016de60cc559-tmp-dir\") pod \"node-resolver-jf64f\" (UID: \"a50e9092-d980-437f-925d-016de60cc559\") " pod="openshift-dns/node-resolver-jf64f"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906527 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-cni-bin\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906556 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-var-lib-kubelet\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906579 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-slash\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906580 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/daf625bc-3312-4340-b15d-afef34e3a313-tmp\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906614 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-cni-bin\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906624 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw7dl\" (UniqueName: \"kubernetes.io/projected/daf625bc-3312-4340-b15d-afef34e3a313-kube-api-access-tw7dl\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906619 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-var-lib-kubelet\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906671 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a50e9092-d980-437f-925d-016de60cc559-hosts-file\") pod \"node-resolver-jf64f\" (UID: \"a50e9092-d980-437f-925d-016de60cc559\") "
pod="openshift-dns/node-resolver-jf64f"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906720 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524b05a6-377c-460c-a38e-359a1d04f304-ovnkube-config\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906743 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-daemon-config\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906766 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-etc-kubernetes\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906789 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-sysctl-d\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906815 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-sysctl-conf\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906791 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a50e9092-d980-437f-925d-016de60cc559-hosts-file\") pod \"node-resolver-jf64f\" (UID: \"a50e9092-d980-437f-925d-016de60cc559\") " pod="openshift-dns/node-resolver-jf64f"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906871 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-run-ovn\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906901 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 14:14:58.907418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906927 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-system-cni-dir\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906952 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cc6398d2-2767-495e-b2e8-7f3f713e5a31-konnectivity-ca\") pod \"konnectivity-agent-fwhsq\" (UID: \"cc6398d2-2767-495e-b2e8-7f3f713e5a31\") " pod="kube-system/konnectivity-agent-fwhsq"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906951 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-run-multus-certs\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906995 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-systemd\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907029 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad3bd840-a967-40e3-9669-959790f9dfb8-host-slash\") pod \"iptables-alerter-z56xg\" (UID: \"ad3bd840-a967-40e3-9669-959790f9dfb8\") " pod="openshift-network-operator/iptables-alerter-z56xg"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907037 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-system-cni-dir\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907051 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907080 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-run-ovn\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906998 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-etc-kubernetes\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907073 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-systemd-units\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907122 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-systemd\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907130 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-sysctl-conf\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.906970 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-sysctl-d\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907157 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-systemd-units\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907168 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-run-openvswitch\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907194 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-run-multus-certs\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907250 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-run-openvswitch\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907304 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cc6398d2-2767-495e-b2e8-7f3f713e5a31-agent-certs\") pod \"konnectivity-agent-fwhsq\" (UID: \"cc6398d2-2767-495e-b2e8-7f3f713e5a31\") " pod="kube-system/konnectivity-agent-fwhsq"
Apr 22 14:14:58.908268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907345 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phg5s\" (UniqueName: \"kubernetes.io/projected/db11d8cb-718e-49f4-a019-bc36f8a9af79-kube-api-access-phg5s\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907374 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-system-cni-dir\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl"
Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907408 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ks86\" (UniqueName: \"kubernetes.io/projected/a50e9092-d980-437f-925d-016de60cc559-kube-api-access-4ks86\") pod \"node-resolver-jf64f\" (UID: \"a50e9092-d980-437f-925d-016de60cc559\") " pod="openshift-dns/node-resolver-jf64f"
Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907435 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-run-netns\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907487 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-kubernetes\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907491 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-daemon-config\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r"
Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907533 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/524b05a6-377c-460c-a38e-359a1d04f304-ovnkube-config\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907598 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-run-netns\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w"
Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907610 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-run\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9"
Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907666 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-kubernetes\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907701 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/daf625bc-3312-4340-b15d-afef34e3a313-etc-tuned\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907709 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-run\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907731 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-sysconfig\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907761 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-sys\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907810 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-sysconfig\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907852 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-cni-binary-copy\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907860 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-sys\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907946 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-hostroot\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907987 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxkt6\" (UniqueName: \"kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6\") pod \"network-check-target-8kfb7\" (UID: \"58472e4a-b808-4034-b26a-9ab40a3074ec\") " pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.907991 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-hostroot\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908027 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908052 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-sys-fs\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908078 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908114 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/524b05a6-377c-460c-a38e-359a1d04f304-env-overrides\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:14:58.908137 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzbnj\" (UniqueName: \"kubernetes.io/projected/524b05a6-377c-460c-a38e-359a1d04f304-kube-api-access-hzbnj\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908181 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-cnibin\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908204 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pbt4\" (UniqueName: \"kubernetes.io/projected/1ec369a9-6fa7-4522-ab1c-257f1ae32b8d-kube-api-access-6pbt4\") pod \"node-ca-9t7jt\" (UID: \"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d\") " pod="openshift-image-registry/node-ca-9t7jt" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908246 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-cni-netd\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908292 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-run-netns\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908314 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-var-lib-cni-bin\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908330 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-cni-binary-copy\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908389 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-cni-netd\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908391 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-cnibin\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908420 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-var-lib-cni-multus\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908433 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-run-netns\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908444 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-conf-dir\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.909954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908478 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-run-ovn-kubernetes\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908488 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-var-lib-cni-multus\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908501 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524b05a6-377c-460c-a38e-359a1d04f304-ovn-node-metrics-cert\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908524 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lcbts\" (UniqueName: \"kubernetes.io/projected/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-kube-api-access-lcbts\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908528 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-var-lib-cni-bin\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908574 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-multus-conf-dir\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908576 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbxr\" (UniqueName: \"kubernetes.io/projected/0212ebfc-c697-40e1-8939-863a200bf32a-kube-api-access-gsbxr\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908596 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908619 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-host-run-ovn-kubernetes\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908608 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-device-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908680 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524b05a6-377c-460c-a38e-359a1d04f304-ovnkube-script-lib\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908703 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-var-lib-kubelet\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908744 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sznwq\" (UniqueName: \"kubernetes.io/projected/ad3bd840-a967-40e3-9669-959790f9dfb8-kube-api-access-sznwq\") pod \"iptables-alerter-z56xg\" (UID: \"ad3bd840-a967-40e3-9669-959790f9dfb8\") " 
pod="openshift-network-operator/iptables-alerter-z56xg" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908777 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/524b05a6-377c-460c-a38e-359a1d04f304-env-overrides\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908781 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-os-release\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908833 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-var-lib-kubelet\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908841 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-registration-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:58.910820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908899 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-modprobe-d\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908929 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-cnibin\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.908991 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ec369a9-6fa7-4522-ab1c-257f1ae32b8d-serviceca\") pod \"node-ca-9t7jt\" (UID: \"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d\") " pod="openshift-image-registry/node-ca-9t7jt" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909022 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ad3bd840-a967-40e3-9669-959790f9dfb8-iptables-alerter-script\") pod \"iptables-alerter-z56xg\" (UID: \"ad3bd840-a967-40e3-9669-959790f9dfb8\") " pod="openshift-network-operator/iptables-alerter-z56xg" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909049 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0212ebfc-c697-40e1-8939-863a200bf32a-cni-binary-copy\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909053 
2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/daf625bc-3312-4340-b15d-afef34e3a313-etc-modprobe-d\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909074 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-etc-openvswitch\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909110 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-run-k8s-cni-cncf-io\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909121 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/524b05a6-377c-460c-a38e-359a1d04f304-etc-openvswitch\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909137 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:14:58.911530 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:14:58.909163 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-host-run-k8s-cni-cncf-io\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909161 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/524b05a6-377c-460c-a38e-359a1d04f304-ovnkube-script-lib\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909163 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-etc-selinux\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909234 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnwgb\" (UniqueName: \"kubernetes.io/projected/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-kube-api-access-gnwgb\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.909935 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ec369a9-6fa7-4522-ab1c-257f1ae32b8d-serviceca\") pod \"node-ca-9t7jt\" (UID: \"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d\") " 
pod="openshift-image-registry/node-ca-9t7jt" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.910754 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/daf625bc-3312-4340-b15d-afef34e3a313-etc-tuned\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.910809 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/daf625bc-3312-4340-b15d-afef34e3a313-tmp\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:58.911530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.911378 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/524b05a6-377c-460c-a38e-359a1d04f304-ovn-node-metrics-cert\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.912258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.911593 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cc6398d2-2767-495e-b2e8-7f3f713e5a31-agent-certs\") pod \"konnectivity-agent-fwhsq\" (UID: \"cc6398d2-2767-495e-b2e8-7f3f713e5a31\") " pod="kube-system/konnectivity-agent-fwhsq" Apr 22 14:14:58.935003 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.934965 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcbts\" (UniqueName: \"kubernetes.io/projected/25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd-kube-api-access-lcbts\") pod \"multus-8tc5r\" (UID: \"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd\") " pod="openshift-multus/multus-8tc5r" Apr 22 
14:14:58.935493 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.935469 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ks86\" (UniqueName: \"kubernetes.io/projected/a50e9092-d980-437f-925d-016de60cc559-kube-api-access-4ks86\") pod \"node-resolver-jf64f\" (UID: \"a50e9092-d980-437f-925d-016de60cc559\") " pod="openshift-dns/node-resolver-jf64f" Apr 22 14:14:58.935493 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.935472 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pbt4\" (UniqueName: \"kubernetes.io/projected/1ec369a9-6fa7-4522-ab1c-257f1ae32b8d-kube-api-access-6pbt4\") pod \"node-ca-9t7jt\" (UID: \"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d\") " pod="openshift-image-registry/node-ca-9t7jt" Apr 22 14:14:58.935686 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.935637 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzbnj\" (UniqueName: \"kubernetes.io/projected/524b05a6-377c-460c-a38e-359a1d04f304-kube-api-access-hzbnj\") pod \"ovnkube-node-k777w\" (UID: \"524b05a6-377c-460c-a38e-359a1d04f304\") " pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:58.936387 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:58.936372 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw7dl\" (UniqueName: \"kubernetes.io/projected/daf625bc-3312-4340-b15d-afef34e3a313-kube-api-access-tw7dl\") pod \"tuned-ghlf9\" (UID: \"daf625bc-3312-4340-b15d-afef34e3a313\") " pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:59.009533 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009498 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad3bd840-a967-40e3-9669-959790f9dfb8-host-slash\") pod \"iptables-alerter-z56xg\" (UID: \"ad3bd840-a967-40e3-9669-959790f9dfb8\") " 
pod="openshift-network-operator/iptables-alerter-z56xg" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009543 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009571 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phg5s\" (UniqueName: \"kubernetes.io/projected/db11d8cb-718e-49f4-a019-bc36f8a9af79-kube-api-access-phg5s\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009598 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-system-cni-dir\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009618 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkt6\" (UniqueName: \"kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6\") pod \"network-check-target-8kfb7\" (UID: \"58472e4a-b808-4034-b26a-9ab40a3074ec\") " pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009618 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/ad3bd840-a967-40e3-9669-959790f9dfb8-host-slash\") pod \"iptables-alerter-z56xg\" (UID: \"ad3bd840-a967-40e3-9669-959790f9dfb8\") " pod="openshift-network-operator/iptables-alerter-z56xg" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009640 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009678 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-sys-fs\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009702 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-system-cni-dir\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009711 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbxr\" (UniqueName: \"kubernetes.io/projected/0212ebfc-c697-40e1-8939-863a200bf32a-kube-api-access-gsbxr\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.009745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009739 
2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-device-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009763 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sznwq\" (UniqueName: \"kubernetes.io/projected/ad3bd840-a967-40e3-9669-959790f9dfb8-kube-api-access-sznwq\") pod \"iptables-alerter-z56xg\" (UID: \"ad3bd840-a967-40e3-9669-959790f9dfb8\") " pod="openshift-network-operator/iptables-alerter-z56xg" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009778 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-sys-fs\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009714 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009787 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-os-release\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " 
pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009824 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-registration-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009848 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-cnibin\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009872 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ad3bd840-a967-40e3-9669-959790f9dfb8-iptables-alerter-script\") pod \"iptables-alerter-z56xg\" (UID: \"ad3bd840-a967-40e3-9669-959790f9dfb8\") " pod="openshift-network-operator/iptables-alerter-z56xg" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009883 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009895 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/0212ebfc-c697-40e1-8939-863a200bf32a-cni-binary-copy\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009910 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-os-release\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009923 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009845 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-device-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009946 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-etc-selinux\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009972 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnwgb\" (UniqueName: \"kubernetes.io/projected/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-kube-api-access-gnwgb\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009974 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0212ebfc-c697-40e1-8939-863a200bf32a-cnibin\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.010020 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0212ebfc-c697-40e1-8939-863a200bf32a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.010293 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.010033 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:14:59.010972 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.010098 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs podName:db11d8cb-718e-49f4-a019-bc36f8a9af79 nodeName:}" failed. No retries permitted until 2026-04-22 14:14:59.510074533 +0000 UTC m=+3.209435194 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs") pod "network-metrics-daemon-7pz2p" (UID: "db11d8cb-718e-49f4-a019-bc36f8a9af79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:14:59.010972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.010124 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-etc-selinux\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.009943 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-registration-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.010159 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0212ebfc-c697-40e1-8939-863a200bf32a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.010972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.010190 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-socket-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.010309 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-socket-dir\") pod \"aws-ebs-csi-driver-node-qxwc5\" (UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.010972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.010404 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ad3bd840-a967-40e3-9669-959790f9dfb8-iptables-alerter-script\") pod \"iptables-alerter-z56xg\" (UID: \"ad3bd840-a967-40e3-9669-959790f9dfb8\") " pod="openshift-network-operator/iptables-alerter-z56xg" Apr 22 14:14:59.010972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.010521 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0212ebfc-c697-40e1-8939-863a200bf32a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.010972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.010640 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0212ebfc-c697-40e1-8939-863a200bf32a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.011366 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.010982 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/0212ebfc-c697-40e1-8939-863a200bf32a-cni-binary-copy\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.018849 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.018828 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:14:59.018849 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.018847 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:14:59.018990 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.018857 2562 projected.go:194] Error preparing data for projected volume kube-api-access-zxkt6 for pod openshift-network-diagnostics/network-check-target-8kfb7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:14:59.018990 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.018912 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6 podName:58472e4a-b808-4034-b26a-9ab40a3074ec nodeName:}" failed. No retries permitted until 2026-04-22 14:14:59.51889763 +0000 UTC m=+3.218258310 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zxkt6" (UniqueName: "kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6") pod "network-check-target-8kfb7" (UID: "58472e4a-b808-4034-b26a-9ab40a3074ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:14:59.024159 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.024108 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sznwq\" (UniqueName: \"kubernetes.io/projected/ad3bd840-a967-40e3-9669-959790f9dfb8-kube-api-access-sznwq\") pod \"iptables-alerter-z56xg\" (UID: \"ad3bd840-a967-40e3-9669-959790f9dfb8\") " pod="openshift-network-operator/iptables-alerter-z56xg" Apr 22 14:14:59.024261 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.024214 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsbxr\" (UniqueName: \"kubernetes.io/projected/0212ebfc-c697-40e1-8939-863a200bf32a-kube-api-access-gsbxr\") pod \"multus-additional-cni-plugins-sl5cl\" (UID: \"0212ebfc-c697-40e1-8939-863a200bf32a\") " pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.024575 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.024555 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phg5s\" (UniqueName: \"kubernetes.io/projected/db11d8cb-718e-49f4-a019-bc36f8a9af79-kube-api-access-phg5s\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:14:59.024861 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.024841 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnwgb\" (UniqueName: \"kubernetes.io/projected/63b3e3a3-eb48-431a-a1ca-d455b05ea91c-kube-api-access-gnwgb\") pod \"aws-ebs-csi-driver-node-qxwc5\" 
(UID: \"63b3e3a3-eb48-431a-a1ca-d455b05ea91c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.095086 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.095055 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jf64f" Apr 22 14:14:59.103873 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.103853 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:14:59.112622 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.112604 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fwhsq" Apr 22 14:14:59.116208 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.116181 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8tc5r" Apr 22 14:14:59.122700 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.122678 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9t7jt" Apr 22 14:14:59.130220 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.130199 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" Apr 22 14:14:59.135754 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.135733 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" Apr 22 14:14:59.143272 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.143251 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z56xg" Apr 22 14:14:59.148836 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.148807 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" Apr 22 14:14:59.268695 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.268664 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:14:59.457701 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:59.457676 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d005dc_d20d_43ae_bb7b_1c3a14bd5ddd.slice/crio-6679e8043705ed7b27ae827925126c2a7c441e8cc3d96ad2ffa2c336c85d6ce6 WatchSource:0}: Error finding container 6679e8043705ed7b27ae827925126c2a7c441e8cc3d96ad2ffa2c336c85d6ce6: Status 404 returned error can't find the container with id 6679e8043705ed7b27ae827925126c2a7c441e8cc3d96ad2ffa2c336c85d6ce6 Apr 22 14:14:59.459134 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:59.459043 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ec369a9_6fa7_4522_ab1c_257f1ae32b8d.slice/crio-966c549e5a3ec00fc3f3c1f8f56eb0559db905824f1becd738e540ddbebfedab WatchSource:0}: Error finding container 966c549e5a3ec00fc3f3c1f8f56eb0559db905824f1becd738e540ddbebfedab: Status 404 returned error can't find the container with id 966c549e5a3ec00fc3f3c1f8f56eb0559db905824f1becd738e540ddbebfedab Apr 22 14:14:59.460561 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:59.460535 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63b3e3a3_eb48_431a_a1ca_d455b05ea91c.slice/crio-413b6eb66f04f3dc07f5b598929e3b6fe8016849073369afa042fbba412f6fb7 WatchSource:0}: Error finding container 413b6eb66f04f3dc07f5b598929e3b6fe8016849073369afa042fbba412f6fb7: Status 404 returned error can't find the container with id 413b6eb66f04f3dc07f5b598929e3b6fe8016849073369afa042fbba412f6fb7 Apr 22 14:14:59.461952 
ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:59.461929 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaf625bc_3312_4340_b15d_afef34e3a313.slice/crio-80ff4b6cae704e49ac1858719267e115884ab73ed835a89119d0e9be02329748 WatchSource:0}: Error finding container 80ff4b6cae704e49ac1858719267e115884ab73ed835a89119d0e9be02329748: Status 404 returned error can't find the container with id 80ff4b6cae704e49ac1858719267e115884ab73ed835a89119d0e9be02329748 Apr 22 14:14:59.463909 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:59.463806 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc6398d2_2767_495e_b2e8_7f3f713e5a31.slice/crio-0caf07fdbe84e90fb6257761b92fbd52672c68d0fb8fcf58878dc3e8800cff8e WatchSource:0}: Error finding container 0caf07fdbe84e90fb6257761b92fbd52672c68d0fb8fcf58878dc3e8800cff8e: Status 404 returned error can't find the container with id 0caf07fdbe84e90fb6257761b92fbd52672c68d0fb8fcf58878dc3e8800cff8e Apr 22 14:14:59.464534 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:59.464463 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524b05a6_377c_460c_a38e_359a1d04f304.slice/crio-06f996fd71777d5141830029010f7c84d0c6e0b6c388955a9b030c17c778cddd WatchSource:0}: Error finding container 06f996fd71777d5141830029010f7c84d0c6e0b6c388955a9b030c17c778cddd: Status 404 returned error can't find the container with id 06f996fd71777d5141830029010f7c84d0c6e0b6c388955a9b030c17c778cddd Apr 22 14:14:59.486014 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:59.485995 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0212ebfc_c697_40e1_8939_863a200bf32a.slice/crio-efda3803a92567d65286c9923f5f3aef59ac5b4349bbc7d714dcc852434b0b72 WatchSource:0}: Error 
finding container efda3803a92567d65286c9923f5f3aef59ac5b4349bbc7d714dcc852434b0b72: Status 404 returned error can't find the container with id efda3803a92567d65286c9923f5f3aef59ac5b4349bbc7d714dcc852434b0b72 Apr 22 14:14:59.487290 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:59.487273 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad3bd840_a967_40e3_9669_959790f9dfb8.slice/crio-cd1786582c206b7f7272d29d61e64412645f148f958cabb742912485eb8b7063 WatchSource:0}: Error finding container cd1786582c206b7f7272d29d61e64412645f148f958cabb742912485eb8b7063: Status 404 returned error can't find the container with id cd1786582c206b7f7272d29d61e64412645f148f958cabb742912485eb8b7063 Apr 22 14:14:59.489380 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:14:59.489351 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda50e9092_d980_437f_925d_016de60cc559.slice/crio-75b0c41c6e52068c0c7863ca869703feb521e96a33689013538f88ac0ade30b7 WatchSource:0}: Error finding container 75b0c41c6e52068c0c7863ca869703feb521e96a33689013538f88ac0ade30b7: Status 404 returned error can't find the container with id 75b0c41c6e52068c0c7863ca869703feb521e96a33689013538f88ac0ade30b7 Apr 22 14:14:59.514086 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.513926 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:14:59.514179 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.514060 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:14:59.514230 ip-10-0-133-31 
kubenswrapper[2562]: E0422 14:14:59.514188 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs podName:db11d8cb-718e-49f4-a019-bc36f8a9af79 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:00.514168322 +0000 UTC m=+4.213528995 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs") pod "network-metrics-daemon-7pz2p" (UID: "db11d8cb-718e-49f4-a019-bc36f8a9af79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:14:59.615035 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.615003 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkt6\" (UniqueName: \"kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6\") pod \"network-check-target-8kfb7\" (UID: \"58472e4a-b808-4034-b26a-9ab40a3074ec\") " pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:14:59.615163 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.615115 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:14:59.615163 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.615128 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:14:59.615163 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.615137 2562 projected.go:194] Error preparing data for projected volume kube-api-access-zxkt6 for pod openshift-network-diagnostics/network-check-target-8kfb7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 
14:14:59.615292 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:14:59.615188 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6 podName:58472e4a-b808-4034-b26a-9ab40a3074ec nodeName:}" failed. No retries permitted until 2026-04-22 14:15:00.615167404 +0000 UTC m=+4.314528064 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxkt6" (UniqueName: "kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6") pod "network-check-target-8kfb7" (UID: "58472e4a-b808-4034-b26a-9ab40a3074ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:14:59.833094 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.833060 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:09:57 +0000 UTC" deadline="2027-12-13 09:12:49.135864071 +0000 UTC" Apr 22 14:14:59.833094 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.833091 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14394h57m49.302775773s" Apr 22 14:14:59.936416 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.936350 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fwhsq" event={"ID":"cc6398d2-2767-495e-b2e8-7f3f713e5a31","Type":"ContainerStarted","Data":"0caf07fdbe84e90fb6257761b92fbd52672c68d0fb8fcf58878dc3e8800cff8e"} Apr 22 14:14:59.940023 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.939374 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal" event={"ID":"8afe0898337b196494e7d612c6fa17e1","Type":"ContainerStarted","Data":"4b358c90da039a109204fe6d0b0bfaa552ed24ea8bec92c78960c3e64833220c"} Apr 
Apr 22 14:14:59.941755 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.941726 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" event={"ID":"0212ebfc-c697-40e1-8939-863a200bf32a","Type":"ContainerStarted","Data":"efda3803a92567d65286c9923f5f3aef59ac5b4349bbc7d714dcc852434b0b72"}
Apr 22 14:14:59.947183 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.947134 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9t7jt" event={"ID":"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d","Type":"ContainerStarted","Data":"966c549e5a3ec00fc3f3c1f8f56eb0559db905824f1becd738e540ddbebfedab"}
Apr 22 14:14:59.951019 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.950995 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" event={"ID":"daf625bc-3312-4340-b15d-afef34e3a313","Type":"ContainerStarted","Data":"80ff4b6cae704e49ac1858719267e115884ab73ed835a89119d0e9be02329748"}
Apr 22 14:14:59.957098 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.957037 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" event={"ID":"63b3e3a3-eb48-431a-a1ca-d455b05ea91c","Type":"ContainerStarted","Data":"413b6eb66f04f3dc07f5b598929e3b6fe8016849073369afa042fbba412f6fb7"}
Apr 22 14:14:59.960983 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.960947 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8tc5r" event={"ID":"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd","Type":"ContainerStarted","Data":"6679e8043705ed7b27ae827925126c2a7c441e8cc3d96ad2ffa2c336c85d6ce6"}
Apr 22 14:14:59.964216 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.964169 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jf64f" event={"ID":"a50e9092-d980-437f-925d-016de60cc559","Type":"ContainerStarted","Data":"75b0c41c6e52068c0c7863ca869703feb521e96a33689013538f88ac0ade30b7"}
Apr 22 14:14:59.966542 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.966509 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z56xg" event={"ID":"ad3bd840-a967-40e3-9669-959790f9dfb8","Type":"ContainerStarted","Data":"cd1786582c206b7f7272d29d61e64412645f148f958cabb742912485eb8b7063"}
Apr 22 14:14:59.971523 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:14:59.971102 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"06f996fd71777d5141830029010f7c84d0c6e0b6c388955a9b030c17c778cddd"}
Apr 22 14:15:00.128458 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.128320 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-31.ec2.internal" podStartSLOduration=2.128296297 podStartE2EDuration="2.128296297s" podCreationTimestamp="2026-04-22 14:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:14:59.953915084 +0000 UTC m=+3.653275767" watchObservedRunningTime="2026-04-22 14:15:00.128296297 +0000 UTC m=+3.827656981"
Apr 22 14:15:00.129382 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.129358 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kcb7w"]
Apr 22 14:15:00.132218 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.132193 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.132311 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.132270 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f"
Apr 22 14:15:00.220672 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.220587 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1b5088f9-42ba-4937-95a9-db3577261f8f-dbus\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.220851 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.220706 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1b5088f9-42ba-4937-95a9-db3577261f8f-kubelet-config\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.220851 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.220740 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.321302 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.321262 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1b5088f9-42ba-4937-95a9-db3577261f8f-dbus\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.321469 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.321361 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1b5088f9-42ba-4937-95a9-db3577261f8f-kubelet-config\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.321469 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.321394 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.321579 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.321543 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:00.321632 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.321606 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret podName:1b5088f9-42ba-4937-95a9-db3577261f8f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:00.82158763 +0000 UTC m=+4.520948298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret") pod "global-pull-secret-syncer-kcb7w" (UID: "1b5088f9-42ba-4937-95a9-db3577261f8f") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:00.322150 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.321960 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1b5088f9-42ba-4937-95a9-db3577261f8f-dbus\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.322150 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.322035 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1b5088f9-42ba-4937-95a9-db3577261f8f-kubelet-config\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.522832 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.522793 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:00.523009 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.522953 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:00.523076 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.523018 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs podName:db11d8cb-718e-49f4-a019-bc36f8a9af79 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:02.522999087 +0000 UTC m=+6.222359752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs") pod "network-metrics-daemon-7pz2p" (UID: "db11d8cb-718e-49f4-a019-bc36f8a9af79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:00.623634 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.623592 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkt6\" (UniqueName: \"kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6\") pod \"network-check-target-8kfb7\" (UID: \"58472e4a-b808-4034-b26a-9ab40a3074ec\") " pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:00.624335 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.623852 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:00.624335 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.623878 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:00.624335 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.623892 2562 projected.go:194] Error preparing data for projected volume kube-api-access-zxkt6 for pod openshift-network-diagnostics/network-check-target-8kfb7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:00.624335 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.623951 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6 podName:58472e4a-b808-4034-b26a-9ab40a3074ec nodeName:}" failed. No retries permitted until 2026-04-22 14:15:02.623931638 +0000 UTC m=+6.323292318 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxkt6" (UniqueName: "kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6") pod "network-check-target-8kfb7" (UID: "58472e4a-b808-4034-b26a-9ab40a3074ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:00.826194 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.825480 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:00.826194 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.825699 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:00.826194 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.825761 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret podName:1b5088f9-42ba-4937-95a9-db3577261f8f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:01.825743936 +0000 UTC m=+5.525104599 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret") pod "global-pull-secret-syncer-kcb7w" (UID: "1b5088f9-42ba-4937-95a9-db3577261f8f") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:00.921844 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.921813 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:00.922358 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.921866 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:00.922358 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.921959 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79"
Apr 22 14:15:00.922358 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:00.922076 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec"
Apr 22 14:15:00.993254 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.993220 2562 generic.go:358] "Generic (PLEG): container finished" podID="86015c3ccc333a84fef469be94c55ee1" containerID="ba85fd37c50f319695eaefc2a4245e33670ebca1a42d5382d58aa73e489d6cae" exitCode=0
Apr 22 14:15:00.993444 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:00.993379 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal" event={"ID":"86015c3ccc333a84fef469be94c55ee1","Type":"ContainerDied","Data":"ba85fd37c50f319695eaefc2a4245e33670ebca1a42d5382d58aa73e489d6cae"}
Apr 22 14:15:01.834830 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:01.834206 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:01.834830 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:01.834399 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:01.834830 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:01.834472 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret podName:1b5088f9-42ba-4937-95a9-db3577261f8f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:03.83444402 +0000 UTC m=+7.533804708 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret") pod "global-pull-secret-syncer-kcb7w" (UID: "1b5088f9-42ba-4937-95a9-db3577261f8f") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:01.921247 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:01.921214 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:01.921411 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:01.921348 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f"
Apr 22 14:15:02.004242 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:02.004202 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal" event={"ID":"86015c3ccc333a84fef469be94c55ee1","Type":"ContainerStarted","Data":"f1996d1b0a37639711f4c3167c83cc666962c05c0f7ddf2eab4ff7db751bfb27"}
Apr 22 14:15:02.540607 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:02.539879 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:02.540607 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:02.540047 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:02.540607 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:02.540133 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs podName:db11d8cb-718e-49f4-a019-bc36f8a9af79 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:06.540114161 +0000 UTC m=+10.239474827 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs") pod "network-metrics-daemon-7pz2p" (UID: "db11d8cb-718e-49f4-a019-bc36f8a9af79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:02.641311 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:02.640621 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkt6\" (UniqueName: \"kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6\") pod \"network-check-target-8kfb7\" (UID: \"58472e4a-b808-4034-b26a-9ab40a3074ec\") " pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:02.641311 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:02.640846 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:02.641311 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:02.640869 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:02.641311 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:02.640882 2562 projected.go:194] Error preparing data for projected volume kube-api-access-zxkt6 for pod openshift-network-diagnostics/network-check-target-8kfb7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:02.641311 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:02.640942 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6 podName:58472e4a-b808-4034-b26a-9ab40a3074ec nodeName:}" failed. No retries permitted until 2026-04-22 14:15:06.640923586 +0000 UTC m=+10.340284262 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxkt6" (UniqueName: "kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6") pod "network-check-target-8kfb7" (UID: "58472e4a-b808-4034-b26a-9ab40a3074ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:02.922923 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:02.922858 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:02.923091 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:02.922991 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79"
Apr 22 14:15:02.923091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:02.923005 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:02.923217 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:02.923110 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec"
Apr 22 14:15:03.849008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:03.848913 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:03.849475 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:03.849085 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:03.849475 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:03.849167 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret podName:1b5088f9-42ba-4937-95a9-db3577261f8f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:07.849145927 +0000 UTC m=+11.548506589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret") pod "global-pull-secret-syncer-kcb7w" (UID: "1b5088f9-42ba-4937-95a9-db3577261f8f") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:03.920822 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:03.920769 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:03.921101 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:03.921068 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f"
Apr 22 14:15:04.922968 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:04.922935 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:04.923433 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:04.923076 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79"
Apr 22 14:15:04.923512 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:04.923493 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:04.923597 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:04.923577 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec"
Apr 22 14:15:05.920664 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:05.920615 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:05.920825 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:05.920772 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f"
Apr 22 14:15:06.574325 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:06.573854 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:06.574325 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:06.574037 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:06.574325 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:06.574142 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs podName:db11d8cb-718e-49f4-a019-bc36f8a9af79 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:14.574120565 +0000 UTC m=+18.273481239 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs") pod "network-metrics-daemon-7pz2p" (UID: "db11d8cb-718e-49f4-a019-bc36f8a9af79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:06.675124 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:06.675080 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkt6\" (UniqueName: \"kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6\") pod \"network-check-target-8kfb7\" (UID: \"58472e4a-b808-4034-b26a-9ab40a3074ec\") " pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:06.675316 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:06.675296 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:06.675316 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:06.675316 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:06.675427 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:06.675329 2562 projected.go:194] Error preparing data for projected volume kube-api-access-zxkt6 for pod openshift-network-diagnostics/network-check-target-8kfb7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:06.675427 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:06.675387 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6 podName:58472e4a-b808-4034-b26a-9ab40a3074ec nodeName:}" failed. No retries permitted until 2026-04-22 14:15:14.675367812 +0000 UTC m=+18.374728517 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxkt6" (UniqueName: "kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6") pod "network-check-target-8kfb7" (UID: "58472e4a-b808-4034-b26a-9ab40a3074ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:06.922079 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:06.921597 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:06.922079 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:06.921736 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79"
Apr 22 14:15:06.922079 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:06.922046 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:06.922356 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:06.922158 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec"
Apr 22 14:15:07.886409 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:07.886342 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:07.886893 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:07.886496 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:07.886893 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:07.886564 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret podName:1b5088f9-42ba-4937-95a9-db3577261f8f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:15.886543262 +0000 UTC m=+19.585903925 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret") pod "global-pull-secret-syncer-kcb7w" (UID: "1b5088f9-42ba-4937-95a9-db3577261f8f") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:07.920692 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:07.920630 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:07.920984 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:07.920777 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f"
Apr 22 14:15:08.920667 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:08.920622 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:08.920667 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:08.920667 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:08.921177 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:08.920782 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79"
Apr 22 14:15:08.921177 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:08.920854 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec"
Apr 22 14:15:09.920902 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:09.920868 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:09.921346 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:09.920980 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f"
Apr 22 14:15:10.920958 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:10.920916 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:10.921407 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:10.920927 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:10.921407 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:10.921070 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79"
Apr 22 14:15:10.921407 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:10.921094 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec"
Apr 22 14:15:11.920843 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:11.920812 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:11.921035 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:11.920935 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f"
Apr 22 14:15:12.920887 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:12.920848 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:15:12.921073 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:12.920986 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79"
Apr 22 14:15:12.921073 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:12.920859 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:12.921465 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:12.921098 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec"
Apr 22 14:15:13.920853 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:13.920813 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:13.921004 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:13.920946 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f" Apr 22 14:15:14.634381 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:14.634345 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:14.634747 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:14.634505 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:14.634747 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:14.634571 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs podName:db11d8cb-718e-49f4-a019-bc36f8a9af79 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:30.6345553 +0000 UTC m=+34.333915960 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs") pod "network-metrics-daemon-7pz2p" (UID: "db11d8cb-718e-49f4-a019-bc36f8a9af79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:14.735584 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:14.735550 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkt6\" (UniqueName: \"kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6\") pod \"network-check-target-8kfb7\" (UID: \"58472e4a-b808-4034-b26a-9ab40a3074ec\") " pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:14.735754 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:14.735731 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:14.735754 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:14.735748 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:14.735833 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:14.735757 2562 projected.go:194] Error preparing data for projected volume kube-api-access-zxkt6 for pod openshift-network-diagnostics/network-check-target-8kfb7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:14.735833 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:14.735814 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6 podName:58472e4a-b808-4034-b26a-9ab40a3074ec nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:30.735797626 +0000 UTC m=+34.435158288 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxkt6" (UniqueName: "kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6") pod "network-check-target-8kfb7" (UID: "58472e4a-b808-4034-b26a-9ab40a3074ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:14.921364 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:14.921285 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:14.921364 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:14.921320 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:14.921535 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:14.921439 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:15:14.921579 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:14.921538 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec" Apr 22 14:15:15.921265 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:15.921234 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:15.921762 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:15.921335 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f" Apr 22 14:15:15.945463 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:15.945431 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:15.945629 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:15.945593 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:15.945698 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:15.945685 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret podName:1b5088f9-42ba-4937-95a9-db3577261f8f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:31.945664009 +0000 UTC m=+35.645024688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret") pod "global-pull-secret-syncer-kcb7w" (UID: "1b5088f9-42ba-4937-95a9-db3577261f8f") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:16.922040 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:16.921843 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:16.922542 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:16.921914 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:16.922542 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:16.922141 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:15:16.922542 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:16.922186 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec" Apr 22 14:15:17.030710 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.030682 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9t7jt" event={"ID":"1ec369a9-6fa7-4522-ab1c-257f1ae32b8d","Type":"ContainerStarted","Data":"0390d40ca6f02e937cbe2e77ce4a8aeba93b87897ad0d8f89aabb55f274ad488"} Apr 22 14:15:17.031920 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.031895 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" event={"ID":"daf625bc-3312-4340-b15d-afef34e3a313","Type":"ContainerStarted","Data":"e464ce4573824ec94cac0aa7e0460faac92818d804fe9a61129e1a4aa9c76bad"} Apr 22 14:15:17.033092 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.033072 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" event={"ID":"63b3e3a3-eb48-431a-a1ca-d455b05ea91c","Type":"ContainerStarted","Data":"7ec2987158d33ea33ed4d446e9d46f210ce2f0a0f94d55f49833e106f321ef5d"} Apr 22 14:15:17.034267 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.034245 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8tc5r" event={"ID":"25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd","Type":"ContainerStarted","Data":"78d3c8acbe7c245ea8bf2314fe7360c3ea2821539a2142bf81b85dc72553e3f0"} Apr 22 14:15:17.035615 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.035593 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jf64f" event={"ID":"a50e9092-d980-437f-925d-016de60cc559","Type":"ContainerStarted","Data":"9c770363ea1a163454ad64c58986599d4f6148b8f93f00cc7f74f2d673d5d82b"} Apr 22 14:15:17.036970 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.036951 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:15:17.037278 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.037256 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"b085f6ac990ba844694f1928588201f4e47ba0ecd621b5ba3d514ff12f2e50b3"} Apr 22 14:15:17.037352 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.037285 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"6d2a512632fea195e2eb782194dce20a73a5d22088015c6db0687011fcb28f17"} Apr 22 14:15:17.038453 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.038426 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fwhsq" event={"ID":"cc6398d2-2767-495e-b2e8-7f3f713e5a31","Type":"ContainerStarted","Data":"5ee705ec42c4644339077319c46953fc782d05fd0672e314fa1b1464bbf64e2e"} Apr 22 14:15:17.039735 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.039717 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" event={"ID":"0212ebfc-c697-40e1-8939-863a200bf32a","Type":"ContainerStarted","Data":"a2af202f38a3a0fdc69d0c3e215ab793d0ac77cd5b93b0869ed5ea87baba283e"} Apr 22 14:15:17.047333 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.047295 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9t7jt" podStartSLOduration=11.094287287 podStartE2EDuration="20.047281435s" podCreationTimestamp="2026-04-22 14:14:57 +0000 UTC" firstStartedPulling="2026-04-22 14:14:59.460952282 +0000 UTC m=+3.160312943" lastFinishedPulling="2026-04-22 14:15:08.413946426 +0000 UTC m=+12.113307091" observedRunningTime="2026-04-22 
14:15:17.046784332 +0000 UTC m=+20.746145021" watchObservedRunningTime="2026-04-22 14:15:17.047281435 +0000 UTC m=+20.746642118" Apr 22 14:15:17.047445 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.047380 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-31.ec2.internal" podStartSLOduration=19.047372217 podStartE2EDuration="19.047372217s" podCreationTimestamp="2026-04-22 14:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:02.020956033 +0000 UTC m=+5.720316718" watchObservedRunningTime="2026-04-22 14:15:17.047372217 +0000 UTC m=+20.746732899" Apr 22 14:15:17.091600 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.091404 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fwhsq" podStartSLOduration=4.001746993 podStartE2EDuration="21.091381724s" podCreationTimestamp="2026-04-22 14:14:56 +0000 UTC" firstStartedPulling="2026-04-22 14:14:59.484966859 +0000 UTC m=+3.184327523" lastFinishedPulling="2026-04-22 14:15:16.57460158 +0000 UTC m=+20.273962254" observedRunningTime="2026-04-22 14:15:17.090671044 +0000 UTC m=+20.790031721" watchObservedRunningTime="2026-04-22 14:15:17.091381724 +0000 UTC m=+20.790742408" Apr 22 14:15:17.092032 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.091878 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jf64f" podStartSLOduration=4.010039698 podStartE2EDuration="21.091865193s" podCreationTimestamp="2026-04-22 14:14:56 +0000 UTC" firstStartedPulling="2026-04-22 14:14:59.492082611 +0000 UTC m=+3.191443284" lastFinishedPulling="2026-04-22 14:15:16.57390811 +0000 UTC m=+20.273268779" observedRunningTime="2026-04-22 14:15:17.069399688 +0000 UTC m=+20.768760370" watchObservedRunningTime="2026-04-22 14:15:17.091865193 +0000 
UTC m=+20.791225875" Apr 22 14:15:17.112039 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.111996 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8tc5r" podStartSLOduration=2.982354167 podStartE2EDuration="20.111981701s" podCreationTimestamp="2026-04-22 14:14:57 +0000 UTC" firstStartedPulling="2026-04-22 14:14:59.459492035 +0000 UTC m=+3.158852710" lastFinishedPulling="2026-04-22 14:15:16.58911958 +0000 UTC m=+20.288480244" observedRunningTime="2026-04-22 14:15:17.111389566 +0000 UTC m=+20.810750248" watchObservedRunningTime="2026-04-22 14:15:17.111981701 +0000 UTC m=+20.811342382" Apr 22 14:15:17.128608 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.128552 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ghlf9" podStartSLOduration=3.01607826 podStartE2EDuration="20.128533223s" podCreationTimestamp="2026-04-22 14:14:57 +0000 UTC" firstStartedPulling="2026-04-22 14:14:59.46367646 +0000 UTC m=+3.163037131" lastFinishedPulling="2026-04-22 14:15:16.576131419 +0000 UTC m=+20.275492094" observedRunningTime="2026-04-22 14:15:17.12851884 +0000 UTC m=+20.827879521" watchObservedRunningTime="2026-04-22 14:15:17.128533223 +0000 UTC m=+20.827893907" Apr 22 14:15:17.920597 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.920564 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:17.920735 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:17.920678 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f" Apr 22 14:15:17.936721 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:17.936690 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 14:15:18.043224 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.043188 2562 generic.go:358] "Generic (PLEG): container finished" podID="0212ebfc-c697-40e1-8939-863a200bf32a" containerID="a2af202f38a3a0fdc69d0c3e215ab793d0ac77cd5b93b0869ed5ea87baba283e" exitCode=0 Apr 22 14:15:18.043356 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.043268 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" event={"ID":"0212ebfc-c697-40e1-8939-863a200bf32a","Type":"ContainerDied","Data":"a2af202f38a3a0fdc69d0c3e215ab793d0ac77cd5b93b0869ed5ea87baba283e"} Apr 22 14:15:18.044925 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.044856 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" event={"ID":"63b3e3a3-eb48-431a-a1ca-d455b05ea91c","Type":"ContainerStarted","Data":"68c01b2b96a67720a71cf55499702d84d9c3ebbe88c4be3a54c50c53033df0cc"} Apr 22 14:15:18.046097 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.046072 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z56xg" event={"ID":"ad3bd840-a967-40e3-9669-959790f9dfb8","Type":"ContainerStarted","Data":"52d955148bddb648effa412454fccceb5551012edb876feb003e53b3bcd6f092"} Apr 22 14:15:18.048466 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.048448 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:15:18.048822 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.048801 2562 
generic.go:358] "Generic (PLEG): container finished" podID="524b05a6-377c-460c-a38e-359a1d04f304" containerID="b085f6ac990ba844694f1928588201f4e47ba0ecd621b5ba3d514ff12f2e50b3" exitCode=1 Apr 22 14:15:18.048912 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.048884 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerDied","Data":"b085f6ac990ba844694f1928588201f4e47ba0ecd621b5ba3d514ff12f2e50b3"} Apr 22 14:15:18.048976 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.048912 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"9578f08ed6183699f01444fb6638b97484318f97c571678232218fdeb2feb2e0"} Apr 22 14:15:18.048976 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.048923 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"ad9b97759d5ea91492885ff7725ad6d3ca3c8c7875b527e92acdb518304d8ba5"} Apr 22 14:15:18.048976 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.048931 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"5d41ef088f42eab919e424f456f9bd016f1824213150973aaacc5e7638d3713f"} Apr 22 14:15:18.048976 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.048941 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"33197c4da881fa91fe9424f4d171d5ef4c1b487649e1c0b41dfb7e65647646c4"} Apr 22 14:15:18.084524 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.084479 2562 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-z56xg" podStartSLOduration=4.002446918 podStartE2EDuration="21.084464558s" podCreationTimestamp="2026-04-22 14:14:57 +0000 UTC" firstStartedPulling="2026-04-22 14:14:59.492161336 +0000 UTC m=+3.191522012" lastFinishedPulling="2026-04-22 14:15:16.574178992 +0000 UTC m=+20.273539652" observedRunningTime="2026-04-22 14:15:18.084083308 +0000 UTC m=+21.783443989" watchObservedRunningTime="2026-04-22 14:15:18.084464558 +0000 UTC m=+21.783825240" Apr 22 14:15:18.870737 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.870447 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:15:17.936706072Z","UUID":"701f48de-aa49-4112-8a6f-33aa3dbc3059","Handler":null,"Name":"","Endpoint":""} Apr 22 14:15:18.872776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.872745 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 14:15:18.872927 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.872787 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 14:15:18.920872 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.920816 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:18.921007 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:18.920962 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:15:18.921377 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:18.921360 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:18.921472 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:18.921455 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec" Apr 22 14:15:19.053037 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:19.052781 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" event={"ID":"63b3e3a3-eb48-431a-a1ca-d455b05ea91c","Type":"ContainerStarted","Data":"6e2f1ff4fc9006f1dff3bfcb749854a81962639372f1e75c2fec5d117dcfabb6"} Apr 22 14:15:19.091714 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:19.091661 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qxwc5" podStartSLOduration=2.68537896 podStartE2EDuration="22.091633927s" podCreationTimestamp="2026-04-22 14:14:57 +0000 UTC" firstStartedPulling="2026-04-22 14:14:59.464170973 +0000 UTC m=+3.163531636" lastFinishedPulling="2026-04-22 14:15:18.870425924 +0000 UTC m=+22.569786603" observedRunningTime="2026-04-22 14:15:19.090936123 +0000 UTC m=+22.790296809" watchObservedRunningTime="2026-04-22 14:15:19.091633927 +0000 UTC m=+22.790994611" Apr 22 14:15:19.920824 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:19.920741 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:19.920995 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:19.920860 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f" Apr 22 14:15:20.058273 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:20.058243 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:15:20.058865 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:20.058840 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"35708b4d869410a7aac7c58ff1da81995d26f15e9dd36fde3ff715006d65825b"} Apr 22 14:15:20.920838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:20.920803 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:20.920991 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:20.920803 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:20.920991 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:20.920909 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec" Apr 22 14:15:20.921083 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:20.921021 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:15:21.003387 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:21.003338 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fwhsq" Apr 22 14:15:21.004060 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:21.004031 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fwhsq" Apr 22 14:15:21.060565 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:21.060488 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fwhsq" Apr 22 14:15:21.061217 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:21.061002 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fwhsq" Apr 22 14:15:21.920810 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:21.920781 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:21.920975 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:21.920895 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f" Apr 22 14:15:22.921390 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:22.921216 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:22.921870 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:22.921227 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:22.921870 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:22.921468 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:15:22.921870 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:22.921524 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec" Apr 22 14:15:23.065018 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:23.064984 2562 generic.go:358] "Generic (PLEG): container finished" podID="0212ebfc-c697-40e1-8939-863a200bf32a" containerID="4c1766b65fc6b1127ea70083e21ba4740b935d82f2a7fee5d4481e4ede2e4ec8" exitCode=0 Apr 22 14:15:23.065186 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:23.065066 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" event={"ID":"0212ebfc-c697-40e1-8939-863a200bf32a","Type":"ContainerDied","Data":"4c1766b65fc6b1127ea70083e21ba4740b935d82f2a7fee5d4481e4ede2e4ec8"} Apr 22 14:15:23.068117 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:23.068097 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:15:23.068397 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:23.068379 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"4c0fdf1af4cb0dd4477374fe53ce6ff3e8b1a052041abf40afff1856f31916a3"} Apr 22 14:15:23.068670 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:23.068633 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:15:23.068751 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:23.068674 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:15:23.068834 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:23.068818 2562 scope.go:117] "RemoveContainer" containerID="b085f6ac990ba844694f1928588201f4e47ba0ecd621b5ba3d514ff12f2e50b3" Apr 22 14:15:23.084776 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:15:23.084753 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:15:23.921032 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:23.920862 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:23.921158 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:23.921111 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f" Apr 22 14:15:24.068909 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.068796 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8kfb7"] Apr 22 14:15:24.069465 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.068943 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:24.069465 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:24.069052 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec" Apr 22 14:15:24.069730 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.069709 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kcb7w"] Apr 22 14:15:24.072156 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.072132 2562 generic.go:358] "Generic (PLEG): container finished" podID="0212ebfc-c697-40e1-8939-863a200bf32a" containerID="2405717d7a2877bbc184788f0469d940fc2e17a83f239c7a730f141e22272444" exitCode=0 Apr 22 14:15:24.072258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.072216 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" event={"ID":"0212ebfc-c697-40e1-8939-863a200bf32a","Type":"ContainerDied","Data":"2405717d7a2877bbc184788f0469d940fc2e17a83f239c7a730f141e22272444"} Apr 22 14:15:24.075501 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.075485 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:15:24.075881 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.075861 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" event={"ID":"524b05a6-377c-460c-a38e-359a1d04f304","Type":"ContainerStarted","Data":"1e393a0286add6fcb75ea65ceb38dc7a5acd68c1a5c02f7bc9638b1889715e12"} Apr 22 14:15:24.075967 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.075890 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:24.076007 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:24.075964 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f" Apr 22 14:15:24.076186 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.076172 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:15:24.081489 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.081468 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7pz2p"] Apr 22 14:15:24.081598 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.081582 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:24.081708 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:24.081690 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:15:24.091642 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.091623 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:15:24.124378 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:24.124329 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" podStartSLOduration=10.986455999 podStartE2EDuration="28.124315594s" podCreationTimestamp="2026-04-22 14:14:56 +0000 UTC" firstStartedPulling="2026-04-22 14:14:59.484963217 +0000 UTC m=+3.184323891" lastFinishedPulling="2026-04-22 14:15:16.62282281 +0000 UTC m=+20.322183486" observedRunningTime="2026-04-22 14:15:24.122977577 +0000 UTC m=+27.822338255" watchObservedRunningTime="2026-04-22 14:15:24.124315594 +0000 UTC m=+27.823676275" Apr 22 14:15:25.079520 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:25.079473 2562 generic.go:358] "Generic (PLEG): container finished" podID="0212ebfc-c697-40e1-8939-863a200bf32a" containerID="137e369089addd42330bb0f3faea737ce4e6aff6fcb1ddafdfc779378fa21a74" exitCode=0 Apr 22 14:15:25.080047 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:25.079550 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" event={"ID":"0212ebfc-c697-40e1-8939-863a200bf32a","Type":"ContainerDied","Data":"137e369089addd42330bb0f3faea737ce4e6aff6fcb1ddafdfc779378fa21a74"} Apr 22 14:15:25.920832 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:25.920792 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:25.921008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:25.920792 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:25.921008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:25.920809 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:25.921008 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:25.920929 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f" Apr 22 14:15:25.921161 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:25.921083 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:15:25.921214 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:25.921178 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec" Apr 22 14:15:27.920532 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:27.920502 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:27.921254 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:27.920510 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:27.921254 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:27.920629 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8kfb7" podUID="58472e4a-b808-4034-b26a-9ab40a3074ec" Apr 22 14:15:27.921254 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:27.920718 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kcb7w" podUID="1b5088f9-42ba-4937-95a9-db3577261f8f" Apr 22 14:15:27.921254 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:27.920745 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:27.921254 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:27.920849 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:15:29.617898 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.617827 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-31.ec2.internal" event="NodeReady" Apr 22 14:15:29.618292 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.618004 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 14:15:29.667053 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.667020 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7j877"] Apr 22 14:15:29.672042 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.672017 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-frhw5"] Apr 22 14:15:29.672194 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.672171 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.674964 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.674934 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 14:15:29.675135 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.674938 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hhqr7\"" Apr 22 14:15:29.675135 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.675073 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 14:15:29.675506 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.675490 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:15:29.677904 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.677886 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:15:29.678156 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.678137 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qh95w\"" Apr 22 14:15:29.678156 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.678150 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:15:29.678320 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.678157 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:15:29.682031 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.682005 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7j877"] Apr 22 14:15:29.685268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.685245 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-frhw5"] Apr 22 14:15:29.759293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.759256 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f2964c4-19d3-4dcc-b821-38a683bc38f7-config-volume\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.759467 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.759305 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod 
\"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.759467 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.759335 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:15:29.759467 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.759357 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h86b\" (UniqueName: \"kubernetes.io/projected/59e79879-c532-4a00-a584-9f807448ef98-kube-api-access-5h86b\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:15:29.759467 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.759400 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3f2964c4-19d3-4dcc-b821-38a683bc38f7-tmp-dir\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.759467 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.759454 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8klz\" (UniqueName: \"kubernetes.io/projected/3f2964c4-19d3-4dcc-b821-38a683bc38f7-kube-api-access-g8klz\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.859847 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.859811 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8klz\" (UniqueName: 
\"kubernetes.io/projected/3f2964c4-19d3-4dcc-b821-38a683bc38f7-kube-api-access-g8klz\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.860025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.859883 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f2964c4-19d3-4dcc-b821-38a683bc38f7-config-volume\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.860025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.859907 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.860025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.859930 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:15:29.860025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.859949 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h86b\" (UniqueName: \"kubernetes.io/projected/59e79879-c532-4a00-a584-9f807448ef98-kube-api-access-5h86b\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:15:29.860230 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:29.860046 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:29.860230 
ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:29.860073 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:29.860230 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:29.860119 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls podName:3f2964c4-19d3-4dcc-b821-38a683bc38f7 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:30.360096443 +0000 UTC m=+34.059457116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls") pod "dns-default-7j877" (UID: "3f2964c4-19d3-4dcc-b821-38a683bc38f7") : secret "dns-default-metrics-tls" not found Apr 22 14:15:29.860230 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:29.860137 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert podName:59e79879-c532-4a00-a584-9f807448ef98 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:30.360127631 +0000 UTC m=+34.059488292 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert") pod "ingress-canary-frhw5" (UID: "59e79879-c532-4a00-a584-9f807448ef98") : secret "canary-serving-cert" not found Apr 22 14:15:29.860230 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.860173 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3f2964c4-19d3-4dcc-b821-38a683bc38f7-tmp-dir\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.860469 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.860428 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3f2964c4-19d3-4dcc-b821-38a683bc38f7-tmp-dir\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.860509 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.860498 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f2964c4-19d3-4dcc-b821-38a683bc38f7-config-volume\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.871282 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.871081 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8klz\" (UniqueName: \"kubernetes.io/projected/3f2964c4-19d3-4dcc-b821-38a683bc38f7-kube-api-access-g8klz\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:29.871282 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.871097 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h86b\" (UniqueName: 
\"kubernetes.io/projected/59e79879-c532-4a00-a584-9f807448ef98-kube-api-access-5h86b\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:15:29.920583 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.920545 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:29.920779 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.920545 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w" Apr 22 14:15:29.920779 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.920545 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:15:29.923965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.923942 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:15:29.923965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.923958 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:15:29.924155 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.924051 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:15:29.924309 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.924293 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-95blp\"" Apr 22 14:15:29.924383 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:29.924340 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:15:29.924383 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:15:29.924356 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4nhsh\"" Apr 22 14:15:30.363953 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:30.363920 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:15:30.363953 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:30.363968 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:15:30.364202 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:30.364087 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:30.364202 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:30.364155 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls podName:3f2964c4-19d3-4dcc-b821-38a683bc38f7 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:31.364133902 +0000 UTC m=+35.063494562 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls") pod "dns-default-7j877" (UID: "3f2964c4-19d3-4dcc-b821-38a683bc38f7") : secret "dns-default-metrics-tls" not found Apr 22 14:15:30.364202 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:30.364092 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:30.364369 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:30.364258 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert podName:59e79879-c532-4a00-a584-9f807448ef98 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:31.364240057 +0000 UTC m=+35.063600720 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert") pod "ingress-canary-frhw5" (UID: "59e79879-c532-4a00-a584-9f807448ef98") : secret "canary-serving-cert" not found Apr 22 14:15:30.667121 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:30.667044 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:15:30.667458 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:30.667196 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:15:30.667458 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:30.667273 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs podName:db11d8cb-718e-49f4-a019-bc36f8a9af79 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:16:02.667256225 +0000 UTC m=+66.366616890 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs") pod "network-metrics-daemon-7pz2p" (UID: "db11d8cb-718e-49f4-a019-bc36f8a9af79") : secret "metrics-daemon-secret" not found
Apr 22 14:15:30.768216 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:30.768175 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkt6\" (UniqueName: \"kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6\") pod \"network-check-target-8kfb7\" (UID: \"58472e4a-b808-4034-b26a-9ab40a3074ec\") " pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:30.771133 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:30.771108 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxkt6\" (UniqueName: \"kubernetes.io/projected/58472e4a-b808-4034-b26a-9ab40a3074ec-kube-api-access-zxkt6\") pod \"network-check-target-8kfb7\" (UID: \"58472e4a-b808-4034-b26a-9ab40a3074ec\") " pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:30.845856 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:30.845822 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:31.139162 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:31.139129 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8kfb7"]
Apr 22 14:15:31.142943 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:15:31.142910 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58472e4a_b808_4034_b26a_9ab40a3074ec.slice/crio-feb8f05719b7b9d35f790827fc08c6c6b46f397baec31dbaaf7e43312d382ff1 WatchSource:0}: Error finding container feb8f05719b7b9d35f790827fc08c6c6b46f397baec31dbaaf7e43312d382ff1: Status 404 returned error can't find the container with id feb8f05719b7b9d35f790827fc08c6c6b46f397baec31dbaaf7e43312d382ff1
Apr 22 14:15:31.372369 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:31.372327 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5"
Apr 22 14:15:31.372560 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:31.372434 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877"
Apr 22 14:15:31.372560 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:31.372471 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:31.372560 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:31.372533 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:31.372714 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:31.372536 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert podName:59e79879-c532-4a00-a584-9f807448ef98 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.37252056 +0000 UTC m=+37.071881225 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert") pod "ingress-canary-frhw5" (UID: "59e79879-c532-4a00-a584-9f807448ef98") : secret "canary-serving-cert" not found
Apr 22 14:15:31.372714 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:31.372594 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls podName:3f2964c4-19d3-4dcc-b821-38a683bc38f7 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.372577998 +0000 UTC m=+37.071938657 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls") pod "dns-default-7j877" (UID: "3f2964c4-19d3-4dcc-b821-38a683bc38f7") : secret "dns-default-metrics-tls" not found
Apr 22 14:15:31.977078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:31.977039 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:31.992281 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:31.992247 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1b5088f9-42ba-4937-95a9-db3577261f8f-original-pull-secret\") pod \"global-pull-secret-syncer-kcb7w\" (UID: \"1b5088f9-42ba-4937-95a9-db3577261f8f\") " pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:32.041251 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:32.040958 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kcb7w"
Apr 22 14:15:32.099965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:32.098965 2562 generic.go:358] "Generic (PLEG): container finished" podID="0212ebfc-c697-40e1-8939-863a200bf32a" containerID="f36ff5e436f6904d96089bdaab086b1ec5785703d04af8334de926c3dd081ad4" exitCode=0
Apr 22 14:15:32.099965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:32.099059 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" event={"ID":"0212ebfc-c697-40e1-8939-863a200bf32a","Type":"ContainerDied","Data":"f36ff5e436f6904d96089bdaab086b1ec5785703d04af8334de926c3dd081ad4"}
Apr 22 14:15:32.101462 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:32.101432 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8kfb7" event={"ID":"58472e4a-b808-4034-b26a-9ab40a3074ec","Type":"ContainerStarted","Data":"feb8f05719b7b9d35f790827fc08c6c6b46f397baec31dbaaf7e43312d382ff1"}
Apr 22 14:15:32.179249 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:32.179213 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kcb7w"]
Apr 22 14:15:32.183926 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:15:32.183888 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b5088f9_42ba_4937_95a9_db3577261f8f.slice/crio-0c0114b9d86af345df6e0d6fb1dd7117c49bbb87b5b8b6a25f20ae2c98711eb5 WatchSource:0}: Error finding container 0c0114b9d86af345df6e0d6fb1dd7117c49bbb87b5b8b6a25f20ae2c98711eb5: Status 404 returned error can't find the container with id 0c0114b9d86af345df6e0d6fb1dd7117c49bbb87b5b8b6a25f20ae2c98711eb5
Apr 22 14:15:33.105048 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:33.105011 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kcb7w" event={"ID":"1b5088f9-42ba-4937-95a9-db3577261f8f","Type":"ContainerStarted","Data":"0c0114b9d86af345df6e0d6fb1dd7117c49bbb87b5b8b6a25f20ae2c98711eb5"}
Apr 22 14:15:33.108283 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:33.108187 2562 generic.go:358] "Generic (PLEG): container finished" podID="0212ebfc-c697-40e1-8939-863a200bf32a" containerID="91d65a564acb0565bc6b2d7b4b8c78a2a3497e082653ac53eda9a3f49704780f" exitCode=0
Apr 22 14:15:33.108283 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:33.108249 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" event={"ID":"0212ebfc-c697-40e1-8939-863a200bf32a","Type":"ContainerDied","Data":"91d65a564acb0565bc6b2d7b4b8c78a2a3497e082653ac53eda9a3f49704780f"}
Apr 22 14:15:33.390987 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:33.390944 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877"
Apr 22 14:15:33.390987 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:33.391002 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5"
Apr 22 14:15:33.391204 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:33.391127 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:33.391204 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:33.391188 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert podName:59e79879-c532-4a00-a584-9f807448ef98 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:37.391168225 +0000 UTC m=+41.090528891 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert") pod "ingress-canary-frhw5" (UID: "59e79879-c532-4a00-a584-9f807448ef98") : secret "canary-serving-cert" not found
Apr 22 14:15:33.391401 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:33.391373 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:33.391534 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:33.391442 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls podName:3f2964c4-19d3-4dcc-b821-38a683bc38f7 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:37.391421578 +0000 UTC m=+41.090782241 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls") pod "dns-default-7j877" (UID: "3f2964c4-19d3-4dcc-b821-38a683bc38f7") : secret "dns-default-metrics-tls" not found
Apr 22 14:15:35.116665 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:35.116594 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" event={"ID":"0212ebfc-c697-40e1-8939-863a200bf32a","Type":"ContainerStarted","Data":"d4c08285ffa37f467aacdede53e46e81fd756296a783940f83a2245449ad3a5c"}
Apr 22 14:15:35.118257 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:35.118224 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8kfb7" event={"ID":"58472e4a-b808-4034-b26a-9ab40a3074ec","Type":"ContainerStarted","Data":"29929db185cfff8d9c0a456b02600975abb6ff3e68cb11922a1c7e254ac844e1"}
Apr 22 14:15:35.118398 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:35.118369 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-8kfb7"
Apr 22 14:15:35.143323 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:35.143268 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sl5cl" podStartSLOduration=6.650632378 podStartE2EDuration="38.143250423s" podCreationTimestamp="2026-04-22 14:14:57 +0000 UTC" firstStartedPulling="2026-04-22 14:14:59.487598674 +0000 UTC m=+3.186959334" lastFinishedPulling="2026-04-22 14:15:30.980216705 +0000 UTC m=+34.679577379" observedRunningTime="2026-04-22 14:15:35.141799352 +0000 UTC m=+38.841160036" watchObservedRunningTime="2026-04-22 14:15:35.143250423 +0000 UTC m=+38.842611104"
Apr 22 14:15:35.159331 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:35.159279 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-8kfb7" podStartSLOduration=34.984321639 podStartE2EDuration="38.159265787s" podCreationTimestamp="2026-04-22 14:14:57 +0000 UTC" firstStartedPulling="2026-04-22 14:15:31.145023829 +0000 UTC m=+34.844384492" lastFinishedPulling="2026-04-22 14:15:34.319967965 +0000 UTC m=+38.019328640" observedRunningTime="2026-04-22 14:15:35.158524796 +0000 UTC m=+38.857885482" watchObservedRunningTime="2026-04-22 14:15:35.159265787 +0000 UTC m=+38.858626461"
Apr 22 14:15:37.123208 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:37.123170 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kcb7w" event={"ID":"1b5088f9-42ba-4937-95a9-db3577261f8f","Type":"ContainerStarted","Data":"4b05e5c8e7f9bdc2df861ebb372839becb634dfb0898e849fab80d8f90b95c58"}
Apr 22 14:15:37.139233 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:37.139183 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kcb7w" podStartSLOduration=33.078760468 podStartE2EDuration="37.139168991s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.18646543 +0000 UTC m=+35.885826091" lastFinishedPulling="2026-04-22 14:15:36.246873941 +0000 UTC m=+39.946234614" observedRunningTime="2026-04-22 14:15:37.138221649 +0000 UTC m=+40.837582330" watchObservedRunningTime="2026-04-22 14:15:37.139168991 +0000 UTC m=+40.838529673"
Apr 22 14:15:37.420530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:37.420439 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877"
Apr 22 14:15:37.420530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:37.420484 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5"
Apr 22 14:15:37.420772 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:37.420573 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:37.420772 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:37.420598 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:37.420772 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:37.420636 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls podName:3f2964c4-19d3-4dcc-b821-38a683bc38f7 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:45.420618718 +0000 UTC m=+49.119979377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls") pod "dns-default-7j877" (UID: "3f2964c4-19d3-4dcc-b821-38a683bc38f7") : secret "dns-default-metrics-tls" not found
Apr 22 14:15:37.420772 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:37.420668 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert podName:59e79879-c532-4a00-a584-9f807448ef98 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:45.420644057 +0000 UTC m=+49.120004717 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert") pod "ingress-canary-frhw5" (UID: "59e79879-c532-4a00-a584-9f807448ef98") : secret "canary-serving-cert" not found
Apr 22 14:15:45.475577 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:45.475529 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5"
Apr 22 14:15:45.476035 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:45.475629 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877"
Apr 22 14:15:45.476035 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:45.475700 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:45.476035 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:45.475768 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:45.476035 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:45.475802 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert podName:59e79879-c532-4a00-a584-9f807448ef98 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:01.475786515 +0000 UTC m=+65.175147175 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert") pod "ingress-canary-frhw5" (UID: "59e79879-c532-4a00-a584-9f807448ef98") : secret "canary-serving-cert" not found
Apr 22 14:15:45.476035 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:15:45.475823 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls podName:3f2964c4-19d3-4dcc-b821-38a683bc38f7 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:01.475808998 +0000 UTC m=+65.175169658 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls") pod "dns-default-7j877" (UID: "3f2964c4-19d3-4dcc-b821-38a683bc38f7") : secret "dns-default-metrics-tls" not found
Apr 22 14:15:46.685863 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.685825 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"]
Apr 22 14:15:46.724865 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.724829 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"]
Apr 22 14:15:46.725035 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.724872 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.727794 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.727772 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 14:15:46.727951 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.727772 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 14:15:46.727951 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.727804 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 14:15:46.727951 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.727773 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 14:15:46.751283 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.751256 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"]
Apr 22 14:15:46.751283 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.751287 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"]
Apr 22 14:15:46.751483 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.751403 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.754640 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.754608 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 14:15:46.755069 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.755049 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 14:15:46.755256 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.755225 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 14:15:46.755356 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.755233 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 14:15:46.783884 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.783859 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-ca\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.783999 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.783891 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.783999 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.783924 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.783999 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.783944 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c063fba5-355f-415e-a445-6b6f66bb8213-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.784103 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.784053 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bffdf534-ddd9-465a-b5a1-fb1da38b61d4-klusterlet-config\") pod \"klusterlet-addon-workmgr-86f5bddf6b-mvzpg\" (UID: \"bffdf534-ddd9-465a-b5a1-fb1da38b61d4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.784103 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.784076 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddplk\" (UniqueName: \"kubernetes.io/projected/bffdf534-ddd9-465a-b5a1-fb1da38b61d4-kube-api-access-ddplk\") pod \"klusterlet-addon-workmgr-86f5bddf6b-mvzpg\" (UID: \"bffdf534-ddd9-465a-b5a1-fb1da38b61d4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.784103 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.784097 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bffdf534-ddd9-465a-b5a1-fb1da38b61d4-tmp\") pod \"klusterlet-addon-workmgr-86f5bddf6b-mvzpg\" (UID: \"bffdf534-ddd9-465a-b5a1-fb1da38b61d4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.784194 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.784170 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mflnw\" (UniqueName: \"kubernetes.io/projected/c063fba5-355f-415e-a445-6b6f66bb8213-kube-api-access-mflnw\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.784225 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.784196 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-hub\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.885441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.885410 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.885441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.885444 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c063fba5-355f-415e-a445-6b6f66bb8213-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.885641 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.885581 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bffdf534-ddd9-465a-b5a1-fb1da38b61d4-klusterlet-config\") pod \"klusterlet-addon-workmgr-86f5bddf6b-mvzpg\" (UID: \"bffdf534-ddd9-465a-b5a1-fb1da38b61d4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.885641 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.885612 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddplk\" (UniqueName: \"kubernetes.io/projected/bffdf534-ddd9-465a-b5a1-fb1da38b61d4-kube-api-access-ddplk\") pod \"klusterlet-addon-workmgr-86f5bddf6b-mvzpg\" (UID: \"bffdf534-ddd9-465a-b5a1-fb1da38b61d4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.885736 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.885666 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bffdf534-ddd9-465a-b5a1-fb1da38b61d4-tmp\") pod \"klusterlet-addon-workmgr-86f5bddf6b-mvzpg\" (UID: \"bffdf534-ddd9-465a-b5a1-fb1da38b61d4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.885736 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.885704 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mflnw\" (UniqueName: \"kubernetes.io/projected/c063fba5-355f-415e-a445-6b6f66bb8213-kube-api-access-mflnw\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.885831 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.885734 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-hub\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.885831 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.885765 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-ca\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.885831 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.885795 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.886228 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.886201 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c063fba5-355f-415e-a445-6b6f66bb8213-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.886478 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.886456 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bffdf534-ddd9-465a-b5a1-fb1da38b61d4-tmp\") pod \"klusterlet-addon-workmgr-86f5bddf6b-mvzpg\" (UID: \"bffdf534-ddd9-465a-b5a1-fb1da38b61d4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.889098 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.889074 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bffdf534-ddd9-465a-b5a1-fb1da38b61d4-klusterlet-config\") pod \"klusterlet-addon-workmgr-86f5bddf6b-mvzpg\" (UID: \"bffdf534-ddd9-465a-b5a1-fb1da38b61d4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.889238 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.889220 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-ca\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.889278 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.889251 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-hub\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.889467 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.889450 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.889789 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.889770 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c063fba5-355f-415e-a445-6b6f66bb8213-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:46.894831 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.894811 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddplk\" (UniqueName: \"kubernetes.io/projected/bffdf534-ddd9-465a-b5a1-fb1da38b61d4-kube-api-access-ddplk\") pod \"klusterlet-addon-workmgr-86f5bddf6b-mvzpg\" (UID: \"bffdf534-ddd9-465a-b5a1-fb1da38b61d4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:46.894923 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:46.894856 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mflnw\" (UniqueName: \"kubernetes.io/projected/c063fba5-355f-415e-a445-6b6f66bb8213-kube-api-access-mflnw\") pod \"cluster-proxy-proxy-agent-7db4dbf7db-v8m67\" (UID: \"c063fba5-355f-415e-a445-6b6f66bb8213\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:47.043697 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:47.043586 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:47.070375 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:47.070337 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"
Apr 22 14:15:47.169052 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:47.169023 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"]
Apr 22 14:15:47.170967 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:15:47.170938 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbffdf534_ddd9_465a_b5a1_fb1da38b61d4.slice/crio-8c337ace2adee47dba078c66ec9070c236bfebe1fd2a4cdd4efa7fea73a9ab72 WatchSource:0}: Error finding container 8c337ace2adee47dba078c66ec9070c236bfebe1fd2a4cdd4efa7fea73a9ab72: Status 404 returned error can't find the container with id 8c337ace2adee47dba078c66ec9070c236bfebe1fd2a4cdd4efa7fea73a9ab72
Apr 22 14:15:47.200342 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:47.200314 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67"]
Apr 22 14:15:47.203003 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:15:47.202973 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc063fba5_355f_415e_a445_6b6f66bb8213.slice/crio-57c0d151ee63152866dab0501a249c115be61c8a9c36f0081128fdbdf17d04d9 WatchSource:0}: Error finding container 57c0d151ee63152866dab0501a249c115be61c8a9c36f0081128fdbdf17d04d9: Status 404 returned error can't find the container with id 57c0d151ee63152866dab0501a249c115be61c8a9c36f0081128fdbdf17d04d9
Apr 22 14:15:48.149179 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:48.149138 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg" event={"ID":"bffdf534-ddd9-465a-b5a1-fb1da38b61d4","Type":"ContainerStarted","Data":"8c337ace2adee47dba078c66ec9070c236bfebe1fd2a4cdd4efa7fea73a9ab72"}
Apr 22 14:15:48.150139 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:48.150103 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" event={"ID":"c063fba5-355f-415e-a445-6b6f66bb8213","Type":"ContainerStarted","Data":"57c0d151ee63152866dab0501a249c115be61c8a9c36f0081128fdbdf17d04d9"}
Apr 22 14:15:53.161733 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:53.161693 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" event={"ID":"c063fba5-355f-415e-a445-6b6f66bb8213","Type":"ContainerStarted","Data":"05aa426b30867c79748be5095b3d8c37b5ff709ca5f772dbd73890a82e928e02"}
Apr 22 14:15:53.162882 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:53.162852 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg" event={"ID":"bffdf534-ddd9-465a-b5a1-fb1da38b61d4","Type":"ContainerStarted","Data":"95ed59dcdb6f6c5d879b53b3f546bcb717e924e6605c062162c2b930bcfac5f6"}
Apr 22 14:15:53.163098 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:53.163079 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:53.164784 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:53.164764 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:15:53.182695 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:53.180239 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg" podStartSLOduration=1.678150981 podStartE2EDuration="7.18020625s" podCreationTimestamp="2026-04-22 14:15:46 +0000 UTC" firstStartedPulling="2026-04-22 14:15:47.172751965 +0000 UTC m=+50.872112624" lastFinishedPulling="2026-04-22 14:15:52.674807219 +0000 UTC m=+56.374167893" observedRunningTime="2026-04-22 14:15:53.180191287 +0000 UTC m=+56.879551970" watchObservedRunningTime="2026-04-22 14:15:53.18020625 +0000 UTC m=+56.879566933"
Apr 22 14:15:55.170185 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:55.170103 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" event={"ID":"c063fba5-355f-415e-a445-6b6f66bb8213","Type":"ContainerStarted","Data":"2b1b6caca80282a3b87107498ff38e653e4d19b1fbf7a42d18952c20eeb479ca"}
Apr 22 14:15:55.170185 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:55.170138 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" event={"ID":"c063fba5-355f-415e-a445-6b6f66bb8213","Type":"ContainerStarted","Data":"655d36e1d7b212cb594370b70489326b7312f79665c0cc27794ee8f7f632e334"}
Apr 22 14:15:55.191083 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:55.191025 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" podStartSLOduration=1.49624593 podStartE2EDuration="9.19101098s" podCreationTimestamp="2026-04-22 14:15:46 +0000 UTC" firstStartedPulling="2026-04-22
14:15:47.204812704 +0000 UTC m=+50.904173364" lastFinishedPulling="2026-04-22 14:15:54.899577752 +0000 UTC m=+58.598938414" observedRunningTime="2026-04-22 14:15:55.190229166 +0000 UTC m=+58.889589849" watchObservedRunningTime="2026-04-22 14:15:55.19101098 +0000 UTC m=+58.890371662" Apr 22 14:15:56.092343 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:15:56.092310 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k777w" Apr 22 14:16:01.486955 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:16:01.486899 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:16:01.486955 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:16:01.486957 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:16:01.487386 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:01.487053 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:01.487386 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:01.487054 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:01.487386 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:01.487101 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert podName:59e79879-c532-4a00-a584-9f807448ef98 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:16:33.487087117 +0000 UTC m=+97.186447777 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert") pod "ingress-canary-frhw5" (UID: "59e79879-c532-4a00-a584-9f807448ef98") : secret "canary-serving-cert" not found Apr 22 14:16:01.487386 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:01.487167 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls podName:3f2964c4-19d3-4dcc-b821-38a683bc38f7 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:33.487151737 +0000 UTC m=+97.186512396 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls") pod "dns-default-7j877" (UID: "3f2964c4-19d3-4dcc-b821-38a683bc38f7") : secret "dns-default-metrics-tls" not found Apr 22 14:16:02.695783 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:16:02.695746 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:16:02.696173 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:02.695896 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:16:02.696173 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:02.695968 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs podName:db11d8cb-718e-49f4-a019-bc36f8a9af79 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:17:06.695950994 +0000 UTC m=+130.395311654 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs") pod "network-metrics-daemon-7pz2p" (UID: "db11d8cb-718e-49f4-a019-bc36f8a9af79") : secret "metrics-daemon-secret" not found Apr 22 14:16:06.122374 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:16:06.122343 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-8kfb7" Apr 22 14:16:33.509985 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:16:33.509858 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:16:33.509985 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:16:33.509897 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:16:33.510488 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:33.509998 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:33.510488 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:33.510061 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls podName:3f2964c4-19d3-4dcc-b821-38a683bc38f7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:37.51004378 +0000 UTC m=+161.209404440 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls") pod "dns-default-7j877" (UID: "3f2964c4-19d3-4dcc-b821-38a683bc38f7") : secret "dns-default-metrics-tls" not found Apr 22 14:16:33.510488 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:33.510002 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:33.510488 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:16:33.510144 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert podName:59e79879-c532-4a00-a584-9f807448ef98 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:37.510128853 +0000 UTC m=+161.209489512 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert") pod "ingress-canary-frhw5" (UID: "59e79879-c532-4a00-a584-9f807448ef98") : secret "canary-serving-cert" not found Apr 22 14:17:06.741168 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:06.741114 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:17:06.741839 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:17:06.741292 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:17:06.741839 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:17:06.741399 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs podName:db11d8cb-718e-49f4-a019-bc36f8a9af79 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:19:08.741376014 +0000 UTC m=+252.440736675 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs") pod "network-metrics-daemon-7pz2p" (UID: "db11d8cb-718e-49f4-a019-bc36f8a9af79") : secret "metrics-daemon-secret" not found Apr 22 14:17:26.210688 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:26.210645 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jf64f_a50e9092-d980-437f-925d-016de60cc559/dns-node-resolver/0.log" Apr 22 14:17:27.008926 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:27.008897 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9t7jt_1ec369a9-6fa7-4522-ab1c-257f1ae32b8d/node-ca/0.log" Apr 22 14:17:32.684338 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:17:32.684281 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7j877" podUID="3f2964c4-19d3-4dcc-b821-38a683bc38f7" Apr 22 14:17:32.691429 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:17:32.691391 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-frhw5" podUID="59e79879-c532-4a00-a584-9f807448ef98" Apr 22 14:17:32.933293 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:17:32.933257 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7pz2p" podUID="db11d8cb-718e-49f4-a019-bc36f8a9af79" Apr 22 14:17:33.391717 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:33.391686 2562 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:17:33.391889 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:33.391691 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7j877" Apr 22 14:17:37.575408 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.575361 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:17:37.575408 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.575402 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:17:37.577675 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.577621 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f2964c4-19d3-4dcc-b821-38a683bc38f7-metrics-tls\") pod \"dns-default-7j877\" (UID: \"3f2964c4-19d3-4dcc-b821-38a683bc38f7\") " pod="openshift-dns/dns-default-7j877" Apr 22 14:17:37.577804 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.577752 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59e79879-c532-4a00-a584-9f807448ef98-cert\") pod \"ingress-canary-frhw5\" (UID: \"59e79879-c532-4a00-a584-9f807448ef98\") " pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:17:37.595238 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.595219 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"dns-dockercfg-hhqr7\"" Apr 22 14:17:37.596403 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.596388 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qh95w\"" Apr 22 14:17:37.603505 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.603490 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-frhw5" Apr 22 14:17:37.603601 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.603585 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7j877" Apr 22 14:17:37.727999 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.727835 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-frhw5"] Apr 22 14:17:37.730583 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:17:37.730555 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e79879_c532_4a00_a584_9f807448ef98.slice/crio-9766631f85b9a0b9346b86cc279500a7c1cfde6cc1619ddaca7a5faa833dcbca WatchSource:0}: Error finding container 9766631f85b9a0b9346b86cc279500a7c1cfde6cc1619ddaca7a5faa833dcbca: Status 404 returned error can't find the container with id 9766631f85b9a0b9346b86cc279500a7c1cfde6cc1619ddaca7a5faa833dcbca Apr 22 14:17:37.747510 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:37.747484 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7j877"] Apr 22 14:17:37.750188 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:17:37.750163 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f2964c4_19d3_4dcc_b821_38a683bc38f7.slice/crio-7c399048553c1357c668df34731279423dc4d358049e36e388554d4f99c89ffd WatchSource:0}: Error finding container 
7c399048553c1357c668df34731279423dc4d358049e36e388554d4f99c89ffd: Status 404 returned error can't find the container with id 7c399048553c1357c668df34731279423dc4d358049e36e388554d4f99c89ffd Apr 22 14:17:38.405349 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:38.405302 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j877" event={"ID":"3f2964c4-19d3-4dcc-b821-38a683bc38f7","Type":"ContainerStarted","Data":"7c399048553c1357c668df34731279423dc4d358049e36e388554d4f99c89ffd"} Apr 22 14:17:38.406370 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:38.406329 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-frhw5" event={"ID":"59e79879-c532-4a00-a584-9f807448ef98","Type":"ContainerStarted","Data":"9766631f85b9a0b9346b86cc279500a7c1cfde6cc1619ddaca7a5faa833dcbca"} Apr 22 14:17:39.411279 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:39.411238 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j877" event={"ID":"3f2964c4-19d3-4dcc-b821-38a683bc38f7","Type":"ContainerStarted","Data":"cbdb64e00e78967a3867a65b9a8a1e68dae0ced069a3d19a2ce8e3d2e0334aac"} Apr 22 14:17:40.415164 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:40.415126 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-frhw5" event={"ID":"59e79879-c532-4a00-a584-9f807448ef98","Type":"ContainerStarted","Data":"3f0b95da55c2dc0bd1188d4e3dfa52ce4baa250d5e3c4d1984e49df894b24952"} Apr 22 14:17:40.416563 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:40.416533 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j877" event={"ID":"3f2964c4-19d3-4dcc-b821-38a683bc38f7","Type":"ContainerStarted","Data":"e36d4bf6739de453c17026d2c1dd4f2dd2991c2e109b9d7505197174451a2516"} Apr 22 14:17:40.416696 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:40.416680 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-dns/dns-default-7j877" Apr 22 14:17:40.431776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:40.431731 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-frhw5" podStartSLOduration=129.481693054 podStartE2EDuration="2m11.431717123s" podCreationTimestamp="2026-04-22 14:15:29 +0000 UTC" firstStartedPulling="2026-04-22 14:17:37.73236279 +0000 UTC m=+161.431723463" lastFinishedPulling="2026-04-22 14:17:39.682386854 +0000 UTC m=+163.381747532" observedRunningTime="2026-04-22 14:17:40.430582259 +0000 UTC m=+164.129942940" watchObservedRunningTime="2026-04-22 14:17:40.431717123 +0000 UTC m=+164.131077805" Apr 22 14:17:44.679195 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.679101 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7j877" podStartSLOduration=134.381566193 podStartE2EDuration="2m15.679086906s" podCreationTimestamp="2026-04-22 14:15:29 +0000 UTC" firstStartedPulling="2026-04-22 14:17:37.751785889 +0000 UTC m=+161.451146549" lastFinishedPulling="2026-04-22 14:17:39.049306588 +0000 UTC m=+162.748667262" observedRunningTime="2026-04-22 14:17:40.448856279 +0000 UTC m=+164.148216962" watchObservedRunningTime="2026-04-22 14:17:44.679086906 +0000 UTC m=+168.378447588" Apr 22 14:17:44.679601 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.679586 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-j76jp"] Apr 22 14:17:44.682442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.682427 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.687364 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.687336 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 14:17:44.688798 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.688543 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4zp5l\"" Apr 22 14:17:44.688798 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.688557 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 14:17:44.688798 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.688725 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 14:17:44.690686 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.690666 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 14:17:44.731203 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.731173 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/65caa8fa-a4c2-4744-bb55-9df4683af02f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.731341 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.731221 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/65caa8fa-a4c2-4744-bb55-9df4683af02f-data-volume\") pod \"insights-runtime-extractor-j76jp\" (UID: 
\"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.731341 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.731244 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/65caa8fa-a4c2-4744-bb55-9df4683af02f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.731341 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.731298 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfmtz\" (UniqueName: \"kubernetes.io/projected/65caa8fa-a4c2-4744-bb55-9df4683af02f-kube-api-access-xfmtz\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.731441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.731345 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/65caa8fa-a4c2-4744-bb55-9df4683af02f-crio-socket\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.735476 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.735458 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j76jp"] Apr 22 14:17:44.824486 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.824454 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"] Apr 22 14:17:44.827530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.827508 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs" Apr 22 14:17:44.832424 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.832396 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/65caa8fa-a4c2-4744-bb55-9df4683af02f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.832533 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.832475 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/65caa8fa-a4c2-4744-bb55-9df4683af02f-data-volume\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.832533 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.832506 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/65caa8fa-a4c2-4744-bb55-9df4683af02f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.832660 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.832536 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfmtz\" (UniqueName: \"kubernetes.io/projected/65caa8fa-a4c2-4744-bb55-9df4683af02f-kube-api-access-xfmtz\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.832660 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.832571 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/65caa8fa-a4c2-4744-bb55-9df4683af02f-crio-socket\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.832744 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.832669 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/65caa8fa-a4c2-4744-bb55-9df4683af02f-crio-socket\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.832848 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.832828 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/65caa8fa-a4c2-4744-bb55-9df4683af02f-data-volume\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.833109 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.833084 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/65caa8fa-a4c2-4744-bb55-9df4683af02f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.834917 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.834900 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/65caa8fa-a4c2-4744-bb55-9df4683af02f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp" Apr 22 14:17:44.835589 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:17:44.835571 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 14:17:44.840202 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.840183 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bgh6b\""
Apr 22 14:17:44.840306 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.840183 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 14:17:44.843047 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.843028 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 14:17:44.843397 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.843378 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 14:17:44.869719 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.869692 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"]
Apr 22 14:17:44.875379 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.875354 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfmtz\" (UniqueName: \"kubernetes.io/projected/65caa8fa-a4c2-4744-bb55-9df4683af02f-kube-api-access-xfmtz\") pod \"insights-runtime-extractor-j76jp\" (UID: \"65caa8fa-a4c2-4744-bb55-9df4683af02f\") " pod="openshift-insights/insights-runtime-extractor-j76jp"
Apr 22 14:17:44.924304 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.924279 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p"
Apr 22 14:17:44.932871 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.932824 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm42p\" (UniqueName: \"kubernetes.io/projected/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-kube-api-access-bm42p\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:44.932871 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.932859 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-image-registry-private-configuration\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:44.932981 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.932887 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-registry-certificates\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:44.932981 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.932929 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-ca-trust-extracted\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:44.932981 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.932957 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-trusted-ca\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:44.933134 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.932985 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-bound-sa-token\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:44.933134 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.933068 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-installation-pull-secrets\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:44.933134 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.933114 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-registry-tls\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:44.991845 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:44.991806 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j76jp"
Apr 22 14:17:45.034322 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.034290 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bm42p\" (UniqueName: \"kubernetes.io/projected/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-kube-api-access-bm42p\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.034322 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.034341 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-image-registry-private-configuration\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.034613 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.034371 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-registry-certificates\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.034613 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.034389 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-ca-trust-extracted\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.034613 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.034432 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-trusted-ca\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.035099 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.034798 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-bound-sa-token\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.035099 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.034878 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-installation-pull-secrets\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.035099 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.034891 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-ca-trust-extracted\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.035099 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.034942 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-registry-tls\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.035438 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.035417 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-registry-certificates\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.035583 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.035561 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-trusted-ca\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.037674 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.037636 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-registry-tls\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.037790 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.037675 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-image-registry-private-configuration\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.037998 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.037979 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-installation-pull-secrets\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.052421 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.052341 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-bound-sa-token\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.056186 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.056155 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm42p\" (UniqueName: \"kubernetes.io/projected/5efc25d9-b6cb-482c-a5aa-bc50fea03e4f-kube-api-access-bm42p\") pod \"image-registry-5bdc9fd6c5-p8cbs\" (UID: \"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f\") " pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.132972 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.132937 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j76jp"]
Apr 22 14:17:45.136157 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.136137 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.137797 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:17:45.137774 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65caa8fa_a4c2_4744_bb55_9df4683af02f.slice/crio-43373e50d4f4f338700a317efa8bfc162a3ffe75ab934176d167818bb1a7408d WatchSource:0}: Error finding container 43373e50d4f4f338700a317efa8bfc162a3ffe75ab934176d167818bb1a7408d: Status 404 returned error can't find the container with id 43373e50d4f4f338700a317efa8bfc162a3ffe75ab934176d167818bb1a7408d
Apr 22 14:17:45.272411 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.272374 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"]
Apr 22 14:17:45.274413 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:17:45.274382 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5efc25d9_b6cb_482c_a5aa_bc50fea03e4f.slice/crio-9ae000b09a6ee8171a7711fbd985973a17d270c36252e18ba0b437933ed263dc WatchSource:0}: Error finding container 9ae000b09a6ee8171a7711fbd985973a17d270c36252e18ba0b437933ed263dc: Status 404 returned error can't find the container with id 9ae000b09a6ee8171a7711fbd985973a17d270c36252e18ba0b437933ed263dc
Apr 22 14:17:45.432776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.432741 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs" event={"ID":"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f","Type":"ContainerStarted","Data":"982de7bdf09d2a86280a207bc9d55393df755cefd3bd2f19f19cb733fa766559"}
Apr 22 14:17:45.432776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.432782 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs" event={"ID":"5efc25d9-b6cb-482c-a5aa-bc50fea03e4f","Type":"ContainerStarted","Data":"9ae000b09a6ee8171a7711fbd985973a17d270c36252e18ba0b437933ed263dc"}
Apr 22 14:17:45.433026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.432841 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:17:45.434087 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.434044 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j76jp" event={"ID":"65caa8fa-a4c2-4744-bb55-9df4683af02f","Type":"ContainerStarted","Data":"fd995165c1c9c3b98b61dd77c2f73ffc3ff8b08e962bb6e5711c293e2093c25b"}
Apr 22 14:17:45.434087 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:45.434086 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j76jp" event={"ID":"65caa8fa-a4c2-4744-bb55-9df4683af02f","Type":"ContainerStarted","Data":"43373e50d4f4f338700a317efa8bfc162a3ffe75ab934176d167818bb1a7408d"}
Apr 22 14:17:46.438006 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:46.437966 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j76jp" event={"ID":"65caa8fa-a4c2-4744-bb55-9df4683af02f","Type":"ContainerStarted","Data":"7d440e61154410a41ee9103d0a590ca3caca14f1c25e90e47ec2dc7399ab0445"}
Apr 22 14:17:47.037214 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:47.037161 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs" podStartSLOduration=3.037146388 podStartE2EDuration="3.037146388s" podCreationTimestamp="2026-04-22 14:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:45.463343091 +0000 UTC m=+169.162703775" watchObservedRunningTime="2026-04-22 14:17:47.037146388 +0000 UTC m=+170.736507070"
Apr 22 14:17:47.442173 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:47.442136 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j76jp" event={"ID":"65caa8fa-a4c2-4744-bb55-9df4683af02f","Type":"ContainerStarted","Data":"ac410179173da6d71731898e17b4f942c391d6c6ba4b648089007176f0431eb1"}
Apr 22 14:17:47.483188 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:47.483137 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-j76jp" podStartSLOduration=1.387031274 podStartE2EDuration="3.483122615s" podCreationTimestamp="2026-04-22 14:17:44 +0000 UTC" firstStartedPulling="2026-04-22 14:17:45.205020236 +0000 UTC m=+168.904380897" lastFinishedPulling="2026-04-22 14:17:47.301111561 +0000 UTC m=+171.000472238" observedRunningTime="2026-04-22 14:17:47.481787033 +0000 UTC m=+171.181147715" watchObservedRunningTime="2026-04-22 14:17:47.483122615 +0000 UTC m=+171.182483311"
Apr 22 14:17:50.421931 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:50.421898 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7j877"
Apr 22 14:17:53.163928 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:53.163863 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg" podUID="bffdf534-ddd9-465a-b5a1-fb1da38b61d4" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.8:8000/readyz\": dial tcp 10.133.0.8:8000: connect: connection refused"
Apr 22 14:17:53.459045 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:53.458949 2562 generic.go:358] "Generic (PLEG): container finished" podID="bffdf534-ddd9-465a-b5a1-fb1da38b61d4" containerID="95ed59dcdb6f6c5d879b53b3f546bcb717e924e6605c062162c2b930bcfac5f6" exitCode=1
Apr 22 14:17:53.459045 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:53.459027 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg" event={"ID":"bffdf534-ddd9-465a-b5a1-fb1da38b61d4","Type":"ContainerDied","Data":"95ed59dcdb6f6c5d879b53b3f546bcb717e924e6605c062162c2b930bcfac5f6"}
Apr 22 14:17:53.459404 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:53.459388 2562 scope.go:117] "RemoveContainer" containerID="95ed59dcdb6f6c5d879b53b3f546bcb717e924e6605c062162c2b930bcfac5f6"
Apr 22 14:17:54.463284 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:54.463245 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg" event={"ID":"bffdf534-ddd9-465a-b5a1-fb1da38b61d4","Type":"ContainerStarted","Data":"6d84117ae2cc4d1d4ce42d20e59850015ee8e0630d0554565e2702cf5db8af57"}
Apr 22 14:17:54.463671 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:54.463527 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:17:54.464133 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:17:54.464115 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86f5bddf6b-mvzpg"
Apr 22 14:18:06.442173 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:06.442140 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5bdc9fd6c5-p8cbs"
Apr 22 14:18:08.729679 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.729630 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6lbp8"]
Apr 22 14:18:08.734067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.734044 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.736772 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.736749 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 14:18:08.736925 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.736822 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 14:18:08.738033 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.737980 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 14:18:08.738033 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.737996 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 14:18:08.738213 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.738091 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2pzx2\""
Apr 22 14:18:08.738213 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.738100 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 14:18:08.738213 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.738125 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 14:18:08.818873 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.818832 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-accelerators-collector-config\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.818873 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.818874 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hngb\" (UniqueName: \"kubernetes.io/projected/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-kube-api-access-6hngb\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.819077 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.818896 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-sys\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.819077 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.818983 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-root\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.819077 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.819013 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-metrics-client-ca\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.819077 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.819064 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-textfile\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.819199 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.819080 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-wtmp\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.819199 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.819102 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-tls\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.819199 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.819162 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920076 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920040 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-accelerators-collector-config\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920076 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920078 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hngb\" (UniqueName: \"kubernetes.io/projected/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-kube-api-access-6hngb\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920269 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920097 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-sys\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920269 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920135 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-root\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920269 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920150 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-metrics-client-ca\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920269 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920177 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-textfile\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920269 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920192 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-wtmp\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920269 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920213 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-tls\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920269 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920237 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-root\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920269 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920253 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920753 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920271 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-sys\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920753 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920378 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-wtmp\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920753 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920681 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-textfile\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920753 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920727 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-accelerators-collector-config\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.920956 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.920823 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-metrics-client-ca\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.922544 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.922524 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.923090 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.923065 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-node-exporter-tls\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:08.929135 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:08.929108 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hngb\" (UniqueName: \"kubernetes.io/projected/d3fa9b97-6462-4371-bd9c-fbfb49153cf7-kube-api-access-6hngb\") pod \"node-exporter-6lbp8\" (UID: \"d3fa9b97-6462-4371-bd9c-fbfb49153cf7\") " pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:09.042950 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:09.042860 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6lbp8"
Apr 22 14:18:09.050686 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:18:09.050643 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3fa9b97_6462_4371_bd9c_fbfb49153cf7.slice/crio-9011749e97dc35e88e22da544c1e5fbe75f2ad48d180a2911942730dc1845771 WatchSource:0}: Error finding container 9011749e97dc35e88e22da544c1e5fbe75f2ad48d180a2911942730dc1845771: Status 404 returned error can't find the container with id 9011749e97dc35e88e22da544c1e5fbe75f2ad48d180a2911942730dc1845771
Apr 22 14:18:09.503308 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:09.503265 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6lbp8" event={"ID":"d3fa9b97-6462-4371-bd9c-fbfb49153cf7","Type":"ContainerStarted","Data":"9011749e97dc35e88e22da544c1e5fbe75f2ad48d180a2911942730dc1845771"}
Apr 22 14:18:10.507161 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:10.507125 2562 generic.go:358] "Generic (PLEG): container finished" podID="d3fa9b97-6462-4371-bd9c-fbfb49153cf7" containerID="e91de7776cb16cd9d90f26c2955da517442be0753e1f245e4a813d931727b53a" exitCode=0
Apr 22 14:18:10.507516 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:10.507209 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6lbp8" event={"ID":"d3fa9b97-6462-4371-bd9c-fbfb49153cf7","Type":"ContainerDied","Data":"e91de7776cb16cd9d90f26c2955da517442be0753e1f245e4a813d931727b53a"}
Apr 22 14:18:11.511147 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:11.511112 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6lbp8" event={"ID":"d3fa9b97-6462-4371-bd9c-fbfb49153cf7","Type":"ContainerStarted","Data":"127bf88f97e51329a27d71d12a03434cc0a3c1483dabc561c5f56630f62e63be"}
Apr 22 14:18:11.511532 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:11.511154 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6lbp8" event={"ID":"d3fa9b97-6462-4371-bd9c-fbfb49153cf7","Type":"ContainerStarted","Data":"47aac9dbee592718ab11f876a8654fb537a1c6b1be99c8d46e9680b72d9a2bf2"}
Apr 22 14:18:11.535314 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:11.535262 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6lbp8" podStartSLOduration=2.838124033 podStartE2EDuration="3.535247621s" podCreationTimestamp="2026-04-22 14:18:08 +0000 UTC" firstStartedPulling="2026-04-22 14:18:09.052441633 +0000 UTC m=+192.751802309" lastFinishedPulling="2026-04-22 14:18:09.749565206 +0000 UTC m=+193.448925897" observedRunningTime="2026-04-22 14:18:11.533980502 +0000 UTC m=+195.233341183" watchObservedRunningTime="2026-04-22 14:18:11.535247621 +0000 UTC m=+195.234608304"
Apr 22 14:18:13.796697 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.796639 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp"]
Apr 22 14:18:13.800367 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.800337 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp"
Apr 22 14:18:13.806303 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.806281 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 14:18:13.806512 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.806494 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 14:18:13.806595 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.806546 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 14:18:13.806595 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.806549 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 14:18:13.807611 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.807594 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-xnlqq\""
Apr 22 14:18:13.808252 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.808237 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 14:18:13.812202 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.812186 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 14:18:13.821257 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.821233 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp"]
Apr 22 14:18:13.963566 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.963512 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0038ac9f-6cc0-4497-8891-cb1708f6d62d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:13.963566 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.963576 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-secret-telemeter-client\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:13.963810 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.963603 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0038ac9f-6cc0-4497-8891-cb1708f6d62d-serving-certs-ca-bundle\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:13.963810 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.963626 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-telemeter-client-tls\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:13.963810 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.963643 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0038ac9f-6cc0-4497-8891-cb1708f6d62d-metrics-client-ca\") pod 
\"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:13.963810 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.963768 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jpdz\" (UniqueName: \"kubernetes.io/projected/0038ac9f-6cc0-4497-8891-cb1708f6d62d-kube-api-access-6jpdz\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:13.963949 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.963831 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-federate-client-tls\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:13.963949 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:13.963857 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.064823 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.064742 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-telemeter-client-tls\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " 
pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.064823 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.064782 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0038ac9f-6cc0-4497-8891-cb1708f6d62d-metrics-client-ca\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.064823 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.064817 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jpdz\" (UniqueName: \"kubernetes.io/projected/0038ac9f-6cc0-4497-8891-cb1708f6d62d-kube-api-access-6jpdz\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.064998 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.064858 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-federate-client-tls\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.064998 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.064885 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.064998 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.064932 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0038ac9f-6cc0-4497-8891-cb1708f6d62d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.065135 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.065056 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-secret-telemeter-client\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.065135 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.065101 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0038ac9f-6cc0-4497-8891-cb1708f6d62d-serving-certs-ca-bundle\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.065638 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.065609 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0038ac9f-6cc0-4497-8891-cb1708f6d62d-metrics-client-ca\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.065825 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.065799 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0038ac9f-6cc0-4497-8891-cb1708f6d62d-serving-certs-ca-bundle\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: 
\"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.065969 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.065948 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0038ac9f-6cc0-4497-8891-cb1708f6d62d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.067499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.067474 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-secret-telemeter-client\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.067585 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.067472 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.067893 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.067871 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-telemeter-client-tls\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.067947 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.067871 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0038ac9f-6cc0-4497-8891-cb1708f6d62d-federate-client-tls\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.076251 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.076230 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jpdz\" (UniqueName: \"kubernetes.io/projected/0038ac9f-6cc0-4497-8891-cb1708f6d62d-kube-api-access-6jpdz\") pod \"telemeter-client-74b7dc4dd5-42lxp\" (UID: \"0038ac9f-6cc0-4497-8891-cb1708f6d62d\") " pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.109784 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.109752 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" Apr 22 14:18:14.232504 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.232470 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp"] Apr 22 14:18:14.235584 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:18:14.235559 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0038ac9f_6cc0_4497_8891_cb1708f6d62d.slice/crio-211b6a988cc2186af7c575abc7899b6aa0d370cbb72fb14f4de3923764efc0b6 WatchSource:0}: Error finding container 211b6a988cc2186af7c575abc7899b6aa0d370cbb72fb14f4de3923764efc0b6: Status 404 returned error can't find the container with id 211b6a988cc2186af7c575abc7899b6aa0d370cbb72fb14f4de3923764efc0b6 Apr 22 14:18:14.521085 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:14.521051 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" 
event={"ID":"0038ac9f-6cc0-4497-8891-cb1708f6d62d","Type":"ContainerStarted","Data":"211b6a988cc2186af7c575abc7899b6aa0d370cbb72fb14f4de3923764efc0b6"} Apr 22 14:18:16.528035 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:16.527943 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" event={"ID":"0038ac9f-6cc0-4497-8891-cb1708f6d62d","Type":"ContainerStarted","Data":"ce51ece6024caca587992cf08c752f5573bc0e9dc12800cf6c53485e0372cc12"} Apr 22 14:18:17.072497 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:17.072448 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" podUID="c063fba5-355f-415e-a445-6b6f66bb8213" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 14:18:17.532537 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:17.532499 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" event={"ID":"0038ac9f-6cc0-4497-8891-cb1708f6d62d","Type":"ContainerStarted","Data":"0e3073f9376ebae3b700863075289b94c88a7d040c2fed7bdaec9ff8bea8d11c"} Apr 22 14:18:17.532537 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:17.532539 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" event={"ID":"0038ac9f-6cc0-4497-8891-cb1708f6d62d","Type":"ContainerStarted","Data":"26420f75b458d4d38a53c711e724f14c9395ba5a88d6cf5f7b5edf8052cca125"} Apr 22 14:18:17.561189 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:17.561136 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-74b7dc4dd5-42lxp" podStartSLOduration=1.6038552 podStartE2EDuration="4.561119981s" podCreationTimestamp="2026-04-22 14:18:13 +0000 UTC" firstStartedPulling="2026-04-22 14:18:14.237305759 +0000 UTC m=+197.936666419" 
lastFinishedPulling="2026-04-22 14:18:17.19457054 +0000 UTC m=+200.893931200" observedRunningTime="2026-04-22 14:18:17.559148444 +0000 UTC m=+201.258509126" watchObservedRunningTime="2026-04-22 14:18:17.561119981 +0000 UTC m=+201.260480662" Apr 22 14:18:27.072172 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:27.072130 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" podUID="c063fba5-355f-415e-a445-6b6f66bb8213" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 14:18:37.072061 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:37.072010 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" podUID="c063fba5-355f-415e-a445-6b6f66bb8213" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 14:18:37.072533 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:37.072111 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" Apr 22 14:18:37.072760 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:37.072717 2562 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"2b1b6caca80282a3b87107498ff38e653e4d19b1fbf7a42d18952c20eeb479ca"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 14:18:37.072854 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:37.072833 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" podUID="c063fba5-355f-415e-a445-6b6f66bb8213" containerName="service-proxy" 
containerID="cri-o://2b1b6caca80282a3b87107498ff38e653e4d19b1fbf7a42d18952c20eeb479ca" gracePeriod=30 Apr 22 14:18:37.583055 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:37.583019 2562 generic.go:358] "Generic (PLEG): container finished" podID="c063fba5-355f-415e-a445-6b6f66bb8213" containerID="2b1b6caca80282a3b87107498ff38e653e4d19b1fbf7a42d18952c20eeb479ca" exitCode=2 Apr 22 14:18:37.583302 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:37.583094 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" event={"ID":"c063fba5-355f-415e-a445-6b6f66bb8213","Type":"ContainerDied","Data":"2b1b6caca80282a3b87107498ff38e653e4d19b1fbf7a42d18952c20eeb479ca"} Apr 22 14:18:37.583302 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:18:37.583130 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7db4dbf7db-v8m67" event={"ID":"c063fba5-355f-415e-a445-6b6f66bb8213","Type":"ContainerStarted","Data":"c46d67dcc30427aee80865a9a3bab6ba20ef29ce540d60cf83953918635f2462"} Apr 22 14:19:08.799478 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:08.799435 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:19:08.801813 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:08.801788 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db11d8cb-718e-49f4-a019-bc36f8a9af79-metrics-certs\") pod \"network-metrics-daemon-7pz2p\" (UID: \"db11d8cb-718e-49f4-a019-bc36f8a9af79\") " pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:19:08.927575 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:19:08.927543 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4nhsh\"" Apr 22 14:19:08.935877 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:08.935852 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pz2p" Apr 22 14:19:09.054682 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:09.054583 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7pz2p"] Apr 22 14:19:09.057719 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:19:09.057689 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb11d8cb_718e_49f4_a019_bc36f8a9af79.slice/crio-6755af97fb36d4b48151b0d6ba9b0cc4249502d6bd93e127cb81caee406478cd WatchSource:0}: Error finding container 6755af97fb36d4b48151b0d6ba9b0cc4249502d6bd93e127cb81caee406478cd: Status 404 returned error can't find the container with id 6755af97fb36d4b48151b0d6ba9b0cc4249502d6bd93e127cb81caee406478cd Apr 22 14:19:09.667497 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:09.667452 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7pz2p" event={"ID":"db11d8cb-718e-49f4-a019-bc36f8a9af79","Type":"ContainerStarted","Data":"6755af97fb36d4b48151b0d6ba9b0cc4249502d6bd93e127cb81caee406478cd"} Apr 22 14:19:10.672263 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:10.672221 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7pz2p" event={"ID":"db11d8cb-718e-49f4-a019-bc36f8a9af79","Type":"ContainerStarted","Data":"54bc573ebc900311fbe5f825a364b3be4e1a53f2c36eb5325d1fa2cacc363514"} Apr 22 14:19:10.672263 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:10.672269 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7pz2p" 
event={"ID":"db11d8cb-718e-49f4-a019-bc36f8a9af79","Type":"ContainerStarted","Data":"b6e9dd9ea4283ee19e900f53abb9d6dc065f5e38cc36736988e07b17b4750063"} Apr 22 14:19:10.690048 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:10.690005 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7pz2p" podStartSLOduration=252.712400647 podStartE2EDuration="4m13.689992107s" podCreationTimestamp="2026-04-22 14:14:57 +0000 UTC" firstStartedPulling="2026-04-22 14:19:09.059894786 +0000 UTC m=+252.759255445" lastFinishedPulling="2026-04-22 14:19:10.037486244 +0000 UTC m=+253.736846905" observedRunningTime="2026-04-22 14:19:10.689788142 +0000 UTC m=+254.389148815" watchObservedRunningTime="2026-04-22 14:19:10.689992107 +0000 UTC m=+254.389352789" Apr 22 14:19:56.808542 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:56.808514 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:19:56.809362 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:56.809340 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:19:56.813068 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:19:56.813045 2562 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 14:20:40.623511 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.623474 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c"] Apr 22 14:20:40.626756 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.626732 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.629626 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.629600 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 14:20:40.629757 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.629739 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 14:20:40.630852 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.630834 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6bzv\"" Apr 22 14:20:40.635439 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.635411 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c"] Apr 22 14:20:40.762573 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.762522 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.762573 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.762575 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grjqg\" (UniqueName: \"kubernetes.io/projected/21edcd99-b549-488e-ad69-561156e7ccb8-kube-api-access-grjqg\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.762846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.762693 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.864001 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.863909 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.864154 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.864019 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grjqg\" (UniqueName: \"kubernetes.io/projected/21edcd99-b549-488e-ad69-561156e7ccb8-kube-api-access-grjqg\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.864154 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.864058 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.864345 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.864321 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.864345 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.864339 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.873025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.873005 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grjqg\" (UniqueName: \"kubernetes.io/projected/21edcd99-b549-488e-ad69-561156e7ccb8-kube-api-access-grjqg\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:40.936690 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:40.936589 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:41.055449 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:41.055422 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c"] Apr 22 14:20:41.057562 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:20:41.057530 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21edcd99_b549_488e_ad69_561156e7ccb8.slice/crio-ac8c7767dcfba72dbb3ea359f40fa4cee99395d195bd65c4ac26eb2dbb6d9364 WatchSource:0}: Error finding container ac8c7767dcfba72dbb3ea359f40fa4cee99395d195bd65c4ac26eb2dbb6d9364: Status 404 returned error can't find the container with id ac8c7767dcfba72dbb3ea359f40fa4cee99395d195bd65c4ac26eb2dbb6d9364 Apr 22 14:20:41.059396 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:41.059381 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:20:41.911151 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:41.911110 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" event={"ID":"21edcd99-b549-488e-ad69-561156e7ccb8","Type":"ContainerStarted","Data":"ac8c7767dcfba72dbb3ea359f40fa4cee99395d195bd65c4ac26eb2dbb6d9364"} Apr 22 14:20:45.922843 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:45.922805 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" event={"ID":"21edcd99-b549-488e-ad69-561156e7ccb8","Type":"ContainerStarted","Data":"f2ac53896a3e6dacba4244dbafbecb566f2a0e0ebd0d811d3dee753e1baf4bd9"} Apr 22 14:20:46.927028 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:46.926986 2562 generic.go:358] "Generic (PLEG): container finished" 
podID="21edcd99-b549-488e-ad69-561156e7ccb8" containerID="f2ac53896a3e6dacba4244dbafbecb566f2a0e0ebd0d811d3dee753e1baf4bd9" exitCode=0 Apr 22 14:20:46.928136 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:46.927829 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" event={"ID":"21edcd99-b549-488e-ad69-561156e7ccb8","Type":"ContainerDied","Data":"f2ac53896a3e6dacba4244dbafbecb566f2a0e0ebd0d811d3dee753e1baf4bd9"} Apr 22 14:20:49.936352 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:49.936316 2562 generic.go:358] "Generic (PLEG): container finished" podID="21edcd99-b549-488e-ad69-561156e7ccb8" containerID="ffe39b9a8e22026b6373cc9d75c4c90ccd332a71e45763e1b3a2c9816b6e5798" exitCode=0 Apr 22 14:20:49.936732 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:49.936402 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" event={"ID":"21edcd99-b549-488e-ad69-561156e7ccb8","Type":"ContainerDied","Data":"ffe39b9a8e22026b6373cc9d75c4c90ccd332a71e45763e1b3a2c9816b6e5798"} Apr 22 14:20:56.959119 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:56.959069 2562 generic.go:358] "Generic (PLEG): container finished" podID="21edcd99-b549-488e-ad69-561156e7ccb8" containerID="b9283d074ca152f6741597c1616dfdc28f04adc74eb5d23e689315550eefab16" exitCode=0 Apr 22 14:20:56.959472 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:56.959129 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" event={"ID":"21edcd99-b549-488e-ad69-561156e7ccb8","Type":"ContainerDied","Data":"b9283d074ca152f6741597c1616dfdc28f04adc74eb5d23e689315550eefab16"} Apr 22 14:20:58.084488 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.084460 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:20:58.109471 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.109445 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grjqg\" (UniqueName: \"kubernetes.io/projected/21edcd99-b549-488e-ad69-561156e7ccb8-kube-api-access-grjqg\") pod \"21edcd99-b549-488e-ad69-561156e7ccb8\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " Apr 22 14:20:58.109624 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.109548 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-bundle\") pod \"21edcd99-b549-488e-ad69-561156e7ccb8\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " Apr 22 14:20:58.109624 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.109609 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-util\") pod \"21edcd99-b549-488e-ad69-561156e7ccb8\" (UID: \"21edcd99-b549-488e-ad69-561156e7ccb8\") " Apr 22 14:20:58.110255 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.110220 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-bundle" (OuterVolumeSpecName: "bundle") pod "21edcd99-b549-488e-ad69-561156e7ccb8" (UID: "21edcd99-b549-488e-ad69-561156e7ccb8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:20:58.111744 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.111713 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21edcd99-b549-488e-ad69-561156e7ccb8-kube-api-access-grjqg" (OuterVolumeSpecName: "kube-api-access-grjqg") pod "21edcd99-b549-488e-ad69-561156e7ccb8" (UID: "21edcd99-b549-488e-ad69-561156e7ccb8"). InnerVolumeSpecName "kube-api-access-grjqg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:20:58.113554 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.113533 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-util" (OuterVolumeSpecName: "util") pod "21edcd99-b549-488e-ad69-561156e7ccb8" (UID: "21edcd99-b549-488e-ad69-561156e7ccb8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:20:58.210713 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.210676 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-util\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:20:58.210713 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.210709 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grjqg\" (UniqueName: \"kubernetes.io/projected/21edcd99-b549-488e-ad69-561156e7ccb8-kube-api-access-grjqg\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:20:58.210713 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.210722 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21edcd99-b549-488e-ad69-561156e7ccb8-bundle\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:20:58.966506 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.966473 2562 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" event={"ID":"21edcd99-b549-488e-ad69-561156e7ccb8","Type":"ContainerDied","Data":"ac8c7767dcfba72dbb3ea359f40fa4cee99395d195bd65c4ac26eb2dbb6d9364"} Apr 22 14:20:58.966506 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.966504 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac8c7767dcfba72dbb3ea359f40fa4cee99395d195bd65c4ac26eb2dbb6d9364" Apr 22 14:20:58.966723 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:20:58.966529 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxrl4c" Apr 22 14:21:07.512233 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.512202 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f"] Apr 22 14:21:07.512705 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.512444 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21edcd99-b549-488e-ad69-561156e7ccb8" containerName="util" Apr 22 14:21:07.512705 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.512455 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="21edcd99-b549-488e-ad69-561156e7ccb8" containerName="util" Apr 22 14:21:07.512705 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.512473 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21edcd99-b549-488e-ad69-561156e7ccb8" containerName="pull" Apr 22 14:21:07.512705 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.512480 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="21edcd99-b549-488e-ad69-561156e7ccb8" containerName="pull" Apr 22 14:21:07.512705 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.512491 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21edcd99-b549-488e-ad69-561156e7ccb8" 
containerName="extract" Apr 22 14:21:07.512705 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.512498 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="21edcd99-b549-488e-ad69-561156e7ccb8" containerName="extract" Apr 22 14:21:07.512705 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.512535 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="21edcd99-b549-488e-ad69-561156e7ccb8" containerName="extract" Apr 22 14:21:07.519510 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.519492 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:07.522611 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.522588 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 14:21:07.522753 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.522677 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 14:21:07.523848 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.523833 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-pkkkc\"" Apr 22 14:21:07.523910 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.523848 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 14:21:07.523950 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.523903 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 14:21:07.523993 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.523945 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 14:21:07.526798 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.526777 2562 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f"] Apr 22 14:21:07.571424 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.571382 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2sv9\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-kube-api-access-n2sv9\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:07.571424 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.571428 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:07.571612 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.571508 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/be8441da-c93a-4948-82b7-8962f68d8a8b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:07.671855 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.671824 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2sv9\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-kube-api-access-n2sv9\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:07.671855 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.671858 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:07.672092 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.671880 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/be8441da-c93a-4948-82b7-8962f68d8a8b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:07.672092 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:07.671964 2562 secret.go:281] references non-existent secret key: tls.crt Apr 22 14:21:07.672092 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:07.671979 2562 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 14:21:07.672092 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:07.671993 2562 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 22 14:21:07.672092 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:07.672002 2562 projected.go:302] Couldn't get configMap payload openshift-keda/keda-ocp-cabundle: configmap references non-existent config key: service-ca.crt Apr 22 14:21:07.672092 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:07.672014 2562 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:07.672092 ip-10-0-133-31 
kubenswrapper[2562]: E0422 14:21:07.672076 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates podName:be8441da-c93a-4948-82b7-8962f68d8a8b nodeName:}" failed. No retries permitted until 2026-04-22 14:21:08.172058599 +0000 UTC m=+371.871419262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates") pod "keda-metrics-apiserver-7c9f485588-xkk2f" (UID: "be8441da-c93a-4948-82b7-8962f68d8a8b") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:07.672425 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.672185 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/be8441da-c93a-4948-82b7-8962f68d8a8b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:07.683139 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.683107 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2sv9\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-kube-api-access-n2sv9\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:07.862878 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.862836 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-825ph"] Apr 22 14:21:07.866080 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.866064 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-825ph" Apr 22 14:21:07.868810 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.868791 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 14:21:07.872814 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.872787 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dts7s\" (UniqueName: \"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-kube-api-access-dts7s\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph" Apr 22 14:21:07.872947 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.872852 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph" Apr 22 14:21:07.874255 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.874230 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-825ph"] Apr 22 14:21:07.974116 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.974078 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dts7s\" (UniqueName: \"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-kube-api-access-dts7s\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph" Apr 22 14:21:07.974315 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.974131 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph" Apr 22 14:21:07.974315 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:07.974237 2562 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 22 14:21:07.974315 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:07.974259 2562 projected.go:302] Couldn't get configMap payload openshift-keda/keda-ocp-cabundle: configmap references non-existent config key: service-ca.crt Apr 22 14:21:07.974315 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:07.974272 2562 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-825ph: [secret "keda-admission-webhooks-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:07.974513 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:07.974345 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates podName:e4c87a38-d836-4458-913a-77aeab562a7f nodeName:}" failed. No retries permitted until 2026-04-22 14:21:08.474325036 +0000 UTC m=+372.173685696 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates") pod "keda-admission-cf49989db-825ph" (UID: "e4c87a38-d836-4458-913a-77aeab562a7f") : [secret "keda-admission-webhooks-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:07.988282 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:07.988252 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dts7s\" (UniqueName: \"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-kube-api-access-dts7s\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph" Apr 22 14:21:08.175828 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:08.175711 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:08.176010 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.175862 2562 secret.go:281] references non-existent secret key: tls.crt Apr 22 14:21:08.176010 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.175883 2562 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 14:21:08.176010 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.175902 2562 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 22 14:21:08.176010 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.175915 2562 projected.go:302] Couldn't get configMap payload openshift-keda/keda-ocp-cabundle: configmap references non-existent config key: service-ca.crt Apr 
22 14:21:08.176010 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.175928 2562 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:08.176010 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.175995 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates podName:be8441da-c93a-4948-82b7-8962f68d8a8b nodeName:}" failed. No retries permitted until 2026-04-22 14:21:09.175977702 +0000 UTC m=+372.875338375 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates") pod "keda-metrics-apiserver-7c9f485588-xkk2f" (UID: "be8441da-c93a-4948-82b7-8962f68d8a8b") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:08.478785 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:08.478688 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph" Apr 22 14:21:08.478965 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.478850 2562 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 22 14:21:08.478965 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.478877 2562 projected.go:302] Couldn't get configMap payload openshift-keda/keda-ocp-cabundle: configmap references 
non-existent config key: service-ca.crt Apr 22 14:21:08.478965 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.478890 2562 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-825ph: [secret "keda-admission-webhooks-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:08.478965 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:08.478952 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates podName:e4c87a38-d836-4458-913a-77aeab562a7f nodeName:}" failed. No retries permitted until 2026-04-22 14:21:09.47893406 +0000 UTC m=+373.178294724 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates") pod "keda-admission-cf49989db-825ph" (UID: "e4c87a38-d836-4458-913a-77aeab562a7f") : [secret "keda-admission-webhooks-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:09.183608 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:09.183567 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:09.184017 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.183709 2562 secret.go:281] references non-existent secret key: tls.crt Apr 22 14:21:09.184017 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.183721 2562 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 14:21:09.184017 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.183735 2562 projected.go:264] Couldn't 
get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 22 14:21:09.184017 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.183745 2562 projected.go:302] Couldn't get configMap payload openshift-keda/keda-ocp-cabundle: configmap references non-existent config key: service-ca.crt Apr 22 14:21:09.184017 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.183754 2562 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:09.184017 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.183812 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates podName:be8441da-c93a-4948-82b7-8962f68d8a8b nodeName:}" failed. No retries permitted until 2026-04-22 14:21:11.183800302 +0000 UTC m=+374.883160962 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates") pod "keda-metrics-apiserver-7c9f485588-xkk2f" (UID: "be8441da-c93a-4948-82b7-8962f68d8a8b") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:09.486279 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:09.486179 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph" Apr 22 14:21:09.486424 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.486324 2562 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 22 14:21:09.486424 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.486351 2562 projected.go:302] Couldn't get configMap payload openshift-keda/keda-ocp-cabundle: configmap references non-existent config key: service-ca.crt Apr 22 14:21:09.486424 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.486363 2562 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-825ph: [secret "keda-admission-webhooks-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:09.486424 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:09.486415 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates podName:e4c87a38-d836-4458-913a-77aeab562a7f nodeName:}" failed. No retries permitted until 2026-04-22 14:21:11.486401097 +0000 UTC m=+375.185761758 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates") pod "keda-admission-cf49989db-825ph" (UID: "e4c87a38-d836-4458-913a-77aeab562a7f") : [secret "keda-admission-webhooks-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:11.199388 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:11.199353 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" Apr 22 14:21:11.199797 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:11.199470 2562 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 22 14:21:11.199797 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:11.199485 2562 projected.go:302] Couldn't get configMap payload openshift-keda/keda-ocp-cabundle: configmap references non-existent config key: service-ca.crt Apr 22 14:21:11.199797 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:11.199496 2562 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f: [secret "keda-metrics-apiserver-certs" not found, configmap references non-existent config key: service-ca.crt] Apr 22 14:21:11.199797 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:11.199555 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates podName:be8441da-c93a-4948-82b7-8962f68d8a8b nodeName:}" failed. No retries permitted until 2026-04-22 14:21:15.199540249 +0000 UTC m=+378.898900909 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates") pod "keda-metrics-apiserver-7c9f485588-xkk2f" (UID: "be8441da-c93a-4948-82b7-8962f68d8a8b") : [secret "keda-metrics-apiserver-certs" not found, configmap references non-existent config key: service-ca.crt]
Apr 22 14:21:11.502539 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:11.502434 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph"
Apr 22 14:21:11.502721 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:11.502592 2562 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 22 14:21:11.502721 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:11.502613 2562 projected.go:302] Couldn't get configMap payload openshift-keda/keda-ocp-cabundle: configmap references non-existent config key: service-ca.crt
Apr 22 14:21:11.502721 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:11.502624 2562 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-825ph: [secret "keda-admission-webhooks-certs" not found, configmap references non-existent config key: service-ca.crt]
Apr 22 14:21:11.502721 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:21:11.502696 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates podName:e4c87a38-d836-4458-913a-77aeab562a7f nodeName:}" failed. No retries permitted until 2026-04-22 14:21:15.502679878 +0000 UTC m=+379.202040538 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates") pod "keda-admission-cf49989db-825ph" (UID: "e4c87a38-d836-4458-913a-77aeab562a7f") : [secret "keda-admission-webhooks-certs" not found, configmap references non-existent config key: service-ca.crt]
Apr 22 14:21:15.232559 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:15.232518 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f"
Apr 22 14:21:15.235060 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:15.235027 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/be8441da-c93a-4948-82b7-8962f68d8a8b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xkk2f\" (UID: \"be8441da-c93a-4948-82b7-8962f68d8a8b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f"
Apr 22 14:21:15.330133 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:15.330086 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f"
Apr 22 14:21:15.443991 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:15.443965 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f"]
Apr 22 14:21:15.445994 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:21:15.445967 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe8441da_c93a_4948_82b7_8962f68d8a8b.slice/crio-1dc2d4fdc510e54cb0a3f67df6aef2860fbe46b90b900df6809d442aabe20c5d WatchSource:0}: Error finding container 1dc2d4fdc510e54cb0a3f67df6aef2860fbe46b90b900df6809d442aabe20c5d: Status 404 returned error can't find the container with id 1dc2d4fdc510e54cb0a3f67df6aef2860fbe46b90b900df6809d442aabe20c5d
Apr 22 14:21:15.535013 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:15.534929 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph"
Apr 22 14:21:15.537314 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:15.537282 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4c87a38-d836-4458-913a-77aeab562a7f-certificates\") pod \"keda-admission-cf49989db-825ph\" (UID: \"e4c87a38-d836-4458-913a-77aeab562a7f\") " pod="openshift-keda/keda-admission-cf49989db-825ph"
Apr 22 14:21:15.676525 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:15.676482 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-825ph"
Apr 22 14:21:15.792151 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:15.792127 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-825ph"]
Apr 22 14:21:15.794322 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:21:15.794295 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4c87a38_d836_4458_913a_77aeab562a7f.slice/crio-f5ae501522263918be3ff1e9c1125c5a4c2bf7299d63428d64c184902b49af4e WatchSource:0}: Error finding container f5ae501522263918be3ff1e9c1125c5a4c2bf7299d63428d64c184902b49af4e: Status 404 returned error can't find the container with id f5ae501522263918be3ff1e9c1125c5a4c2bf7299d63428d64c184902b49af4e
Apr 22 14:21:16.012328 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:16.012291 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" event={"ID":"be8441da-c93a-4948-82b7-8962f68d8a8b","Type":"ContainerStarted","Data":"1dc2d4fdc510e54cb0a3f67df6aef2860fbe46b90b900df6809d442aabe20c5d"}
Apr 22 14:21:16.013299 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:16.013270 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-825ph" event={"ID":"e4c87a38-d836-4458-913a-77aeab562a7f","Type":"ContainerStarted","Data":"f5ae501522263918be3ff1e9c1125c5a4c2bf7299d63428d64c184902b49af4e"}
Apr 22 14:21:19.023163 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:19.023118 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" event={"ID":"be8441da-c93a-4948-82b7-8962f68d8a8b","Type":"ContainerStarted","Data":"622908f654f9253c757cef4cb6232f95c19aa59b2fc13138ec59c15908046df1"}
Apr 22 14:21:19.023591 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:19.023223 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f"
Apr 22 14:21:19.024317 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:19.024296 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-825ph" event={"ID":"e4c87a38-d836-4458-913a-77aeab562a7f","Type":"ContainerStarted","Data":"cff64408f9945055741a681493b0039ad5f1a8f9e4f697f0a2a69af16a869d88"}
Apr 22 14:21:19.024440 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:19.024429 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-825ph"
Apr 22 14:21:19.043735 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:19.043688 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f" podStartSLOduration=9.465297463 podStartE2EDuration="12.043676577s" podCreationTimestamp="2026-04-22 14:21:07 +0000 UTC" firstStartedPulling="2026-04-22 14:21:15.447242862 +0000 UTC m=+379.146603522" lastFinishedPulling="2026-04-22 14:21:18.025621962 +0000 UTC m=+381.724982636" observedRunningTime="2026-04-22 14:21:19.043012242 +0000 UTC m=+382.742372929" watchObservedRunningTime="2026-04-22 14:21:19.043676577 +0000 UTC m=+382.743037317"
Apr 22 14:21:19.059873 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:19.059835 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-825ph" podStartSLOduration=9.836491016 podStartE2EDuration="12.059824872s" podCreationTimestamp="2026-04-22 14:21:07 +0000 UTC" firstStartedPulling="2026-04-22 14:21:15.795667817 +0000 UTC m=+379.495028477" lastFinishedPulling="2026-04-22 14:21:18.019001674 +0000 UTC m=+381.718362333" observedRunningTime="2026-04-22 14:21:19.059156631 +0000 UTC m=+382.758517314" watchObservedRunningTime="2026-04-22 14:21:19.059824872 +0000 UTC m=+382.759185554"
Apr 22 14:21:30.031076 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:30.031045 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xkk2f"
Apr 22 14:21:40.029055 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:21:40.029021 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-825ph"
Apr 22 14:22:01.636942 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.636907 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"]
Apr 22 14:22:01.643110 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.643087 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.646079 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.646053 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 14:22:01.646207 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.646121 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6bzv\""
Apr 22 14:22:01.647157 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.647135 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 14:22:01.650112 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.650083 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"]
Apr 22 14:22:01.784019 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.783983 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.784181 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.784037 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.784181 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.784082 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmm7l\" (UniqueName: \"kubernetes.io/projected/76733d07-dc17-4382-9e2a-e8326a5384ee-kube-api-access-zmm7l\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.885049 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.885000 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.885176 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.885070 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.885176 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.885100 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmm7l\" (UniqueName: \"kubernetes.io/projected/76733d07-dc17-4382-9e2a-e8326a5384ee-kube-api-access-zmm7l\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.885393 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.885370 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.885482 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.885459 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.894505 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.894439 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmm7l\" (UniqueName: \"kubernetes.io/projected/76733d07-dc17-4382-9e2a-e8326a5384ee-kube-api-access-zmm7l\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:01.952847 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:01.952811 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:02.067764 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:02.067738 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"]
Apr 22 14:22:02.069675 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:22:02.069636 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76733d07_dc17_4382_9e2a_e8326a5384ee.slice/crio-9ff9308c49687e7e5637158d2c11f5085c72c5decc0df6c143df9a561da6b96b WatchSource:0}: Error finding container 9ff9308c49687e7e5637158d2c11f5085c72c5decc0df6c143df9a561da6b96b: Status 404 returned error can't find the container with id 9ff9308c49687e7e5637158d2c11f5085c72c5decc0df6c143df9a561da6b96b
Apr 22 14:22:02.140143 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:02.140115 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k" event={"ID":"76733d07-dc17-4382-9e2a-e8326a5384ee","Type":"ContainerStarted","Data":"a7c1f53f99d3f642d28758672f99fa0967060197695d4fd2a3727d3a0fca5d37"}
Apr 22 14:22:02.140249 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:02.140148 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k" event={"ID":"76733d07-dc17-4382-9e2a-e8326a5384ee","Type":"ContainerStarted","Data":"9ff9308c49687e7e5637158d2c11f5085c72c5decc0df6c143df9a561da6b96b"}
Apr 22 14:22:03.143594 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:03.143554 2562 generic.go:358] "Generic (PLEG): container finished" podID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerID="a7c1f53f99d3f642d28758672f99fa0967060197695d4fd2a3727d3a0fca5d37" exitCode=0
Apr 22 14:22:03.143961 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:03.143644 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k" event={"ID":"76733d07-dc17-4382-9e2a-e8326a5384ee","Type":"ContainerDied","Data":"a7c1f53f99d3f642d28758672f99fa0967060197695d4fd2a3727d3a0fca5d37"}
Apr 22 14:22:05.151440 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:05.151402 2562 generic.go:358] "Generic (PLEG): container finished" podID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerID="88117cff589c4de0cc6073e6322c9961dfc57bfd08f8fe1355bdf1ec6c1021ea" exitCode=0
Apr 22 14:22:05.151818 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:05.151466 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k" event={"ID":"76733d07-dc17-4382-9e2a-e8326a5384ee","Type":"ContainerDied","Data":"88117cff589c4de0cc6073e6322c9961dfc57bfd08f8fe1355bdf1ec6c1021ea"}
Apr 22 14:22:06.156466 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:06.156430 2562 generic.go:358] "Generic (PLEG): container finished" podID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerID="6260e9ef4c1bb58a098157a68fa8ef325e857faad74a857cdda8e4529b2e9e36" exitCode=0
Apr 22 14:22:06.156466 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:06.156469 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k" event={"ID":"76733d07-dc17-4382-9e2a-e8326a5384ee","Type":"ContainerDied","Data":"6260e9ef4c1bb58a098157a68fa8ef325e857faad74a857cdda8e4529b2e9e36"}
Apr 22 14:22:07.273516 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.273494 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:07.424615 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.424522 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-util\") pod \"76733d07-dc17-4382-9e2a-e8326a5384ee\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") "
Apr 22 14:22:07.424615 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.424566 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-bundle\") pod \"76733d07-dc17-4382-9e2a-e8326a5384ee\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") "
Apr 22 14:22:07.424615 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.424608 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmm7l\" (UniqueName: \"kubernetes.io/projected/76733d07-dc17-4382-9e2a-e8326a5384ee-kube-api-access-zmm7l\") pod \"76733d07-dc17-4382-9e2a-e8326a5384ee\" (UID: \"76733d07-dc17-4382-9e2a-e8326a5384ee\") "
Apr 22 14:22:07.425275 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.425248 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-bundle" (OuterVolumeSpecName: "bundle") pod "76733d07-dc17-4382-9e2a-e8326a5384ee" (UID: "76733d07-dc17-4382-9e2a-e8326a5384ee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:22:07.426692 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.426667 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76733d07-dc17-4382-9e2a-e8326a5384ee-kube-api-access-zmm7l" (OuterVolumeSpecName: "kube-api-access-zmm7l") pod "76733d07-dc17-4382-9e2a-e8326a5384ee" (UID: "76733d07-dc17-4382-9e2a-e8326a5384ee"). InnerVolumeSpecName "kube-api-access-zmm7l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:22:07.429299 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.429274 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-util" (OuterVolumeSpecName: "util") pod "76733d07-dc17-4382-9e2a-e8326a5384ee" (UID: "76733d07-dc17-4382-9e2a-e8326a5384ee"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:22:07.525698 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.525644 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zmm7l\" (UniqueName: \"kubernetes.io/projected/76733d07-dc17-4382-9e2a-e8326a5384ee-kube-api-access-zmm7l\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:22:07.525698 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.525694 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-util\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:22:07.525698 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:07.525703 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76733d07-dc17-4382-9e2a-e8326a5384ee-bundle\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:22:08.163843 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:08.163805 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k" event={"ID":"76733d07-dc17-4382-9e2a-e8326a5384ee","Type":"ContainerDied","Data":"9ff9308c49687e7e5637158d2c11f5085c72c5decc0df6c143df9a561da6b96b"}
Apr 22 14:22:08.163843 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:08.163843 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ff9308c49687e7e5637158d2c11f5085c72c5decc0df6c143df9a561da6b96b"
Apr 22 14:22:08.163843 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:08.163819 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dvmp5k"
Apr 22 14:22:23.569880 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.569803 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"]
Apr 22 14:22:23.570240 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.570093 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerName="pull"
Apr 22 14:22:23.570240 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.570105 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerName="pull"
Apr 22 14:22:23.570240 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.570114 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerName="util"
Apr 22 14:22:23.570240 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.570119 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerName="util"
Apr 22 14:22:23.570240 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.570135 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerName="extract"
Apr 22 14:22:23.570240 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.570141 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerName="extract"
Apr 22 14:22:23.570240 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.570184 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="76733d07-dc17-4382-9e2a-e8326a5384ee" containerName="extract"
Apr 22 14:22:23.573087 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.573071 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.576035 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.576015 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 14:22:23.576128 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.576075 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 14:22:23.577249 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.577233 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6bzv\""
Apr 22 14:22:23.581917 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.581897 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"]
Apr 22 14:22:23.745286 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.745233 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8f68\" (UniqueName: \"kubernetes.io/projected/54cbf813-e56a-48c3-b5bf-c01886f25ce5-kube-api-access-w8f68\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.745433 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.745312 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.745433 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.745338 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.846231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.846196 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8f68\" (UniqueName: \"kubernetes.io/projected/54cbf813-e56a-48c3-b5bf-c01886f25ce5-kube-api-access-w8f68\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.846395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.846256 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.846395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.846277 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.846618 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.846602 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.846709 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.846688 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.856008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.855980 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8f68\" (UniqueName: \"kubernetes.io/projected/54cbf813-e56a-48c3-b5bf-c01886f25ce5-kube-api-access-w8f68\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:23.882927 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:23.882898 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:24.006312 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:24.006283 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"]
Apr 22 14:22:24.008035 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:22:24.008006 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54cbf813_e56a_48c3_b5bf_c01886f25ce5.slice/crio-06403c4ae06bb9d6dd6c10f1f63fb12cb89b4be2e2683b5703d625e99b54d161 WatchSource:0}: Error finding container 06403c4ae06bb9d6dd6c10f1f63fb12cb89b4be2e2683b5703d625e99b54d161: Status 404 returned error can't find the container with id 06403c4ae06bb9d6dd6c10f1f63fb12cb89b4be2e2683b5703d625e99b54d161
Apr 22 14:22:24.208950 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:24.208920 2562 generic.go:358] "Generic (PLEG): container finished" podID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerID="f806299896a7acfcad0fb69f322abe12c90930317c43322831f8148d4291b4a8" exitCode=0
Apr 22 14:22:24.209104 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:24.208996 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb" event={"ID":"54cbf813-e56a-48c3-b5bf-c01886f25ce5","Type":"ContainerDied","Data":"f806299896a7acfcad0fb69f322abe12c90930317c43322831f8148d4291b4a8"}
Apr 22 14:22:24.209104 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:24.209019 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb" event={"ID":"54cbf813-e56a-48c3-b5bf-c01886f25ce5","Type":"ContainerStarted","Data":"06403c4ae06bb9d6dd6c10f1f63fb12cb89b4be2e2683b5703d625e99b54d161"}
Apr 22 14:22:26.216946 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:26.216915 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb" event={"ID":"54cbf813-e56a-48c3-b5bf-c01886f25ce5","Type":"ContainerStarted","Data":"762ae2a29e7a2ac69226eef58bfbff357ccc7dd457483eb5c481ea00723a7b08"}
Apr 22 14:22:27.221337 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:27.221304 2562 generic.go:358] "Generic (PLEG): container finished" podID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerID="762ae2a29e7a2ac69226eef58bfbff357ccc7dd457483eb5c481ea00723a7b08" exitCode=0
Apr 22 14:22:27.221824 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:27.221393 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb" event={"ID":"54cbf813-e56a-48c3-b5bf-c01886f25ce5","Type":"ContainerDied","Data":"762ae2a29e7a2ac69226eef58bfbff357ccc7dd457483eb5c481ea00723a7b08"}
Apr 22 14:22:28.225887 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:28.225846 2562 generic.go:358] "Generic (PLEG): container finished" podID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerID="eec8baf3c944fd3429eaeb05fe2dbff12c76d1c1e5448c785f83d325212fb79b" exitCode=0
Apr 22 14:22:28.226262 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:28.225934 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb" event={"ID":"54cbf813-e56a-48c3-b5bf-c01886f25ce5","Type":"ContainerDied","Data":"eec8baf3c944fd3429eaeb05fe2dbff12c76d1c1e5448c785f83d325212fb79b"}
Apr 22 14:22:29.344372 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.344350 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:29.494084 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.493993 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-util\") pod \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") "
Apr 22 14:22:29.494084 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.494052 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8f68\" (UniqueName: \"kubernetes.io/projected/54cbf813-e56a-48c3-b5bf-c01886f25ce5-kube-api-access-w8f68\") pod \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") "
Apr 22 14:22:29.494084 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.494077 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-bundle\") pod \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\" (UID: \"54cbf813-e56a-48c3-b5bf-c01886f25ce5\") "
Apr 22 14:22:29.494468 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.494447 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-bundle" (OuterVolumeSpecName: "bundle") pod "54cbf813-e56a-48c3-b5bf-c01886f25ce5" (UID: "54cbf813-e56a-48c3-b5bf-c01886f25ce5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:22:29.496179 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.496154 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cbf813-e56a-48c3-b5bf-c01886f25ce5-kube-api-access-w8f68" (OuterVolumeSpecName: "kube-api-access-w8f68") pod "54cbf813-e56a-48c3-b5bf-c01886f25ce5" (UID: "54cbf813-e56a-48c3-b5bf-c01886f25ce5"). InnerVolumeSpecName "kube-api-access-w8f68". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:22:29.498779 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.498754 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-util" (OuterVolumeSpecName: "util") pod "54cbf813-e56a-48c3-b5bf-c01886f25ce5" (UID: "54cbf813-e56a-48c3-b5bf-c01886f25ce5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:22:29.594773 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.594720 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-util\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:22:29.594773 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.594769 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8f68\" (UniqueName: \"kubernetes.io/projected/54cbf813-e56a-48c3-b5bf-c01886f25ce5-kube-api-access-w8f68\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:22:29.594773 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:29.594782 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54cbf813-e56a-48c3-b5bf-c01886f25ce5-bundle\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:22:30.233293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:30.233259 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb" event={"ID":"54cbf813-e56a-48c3-b5bf-c01886f25ce5","Type":"ContainerDied","Data":"06403c4ae06bb9d6dd6c10f1f63fb12cb89b4be2e2683b5703d625e99b54d161"}
Apr 22 14:22:30.233293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:30.233295 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06403c4ae06bb9d6dd6c10f1f63fb12cb89b4be2e2683b5703d625e99b54d161"
Apr 22 14:22:30.233490 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:30.233303 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fq9rjb"
Apr 22 14:22:49.866786 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.866747 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr"]
Apr 22 14:22:49.867196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.866992 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerName="pull"
Apr 22 14:22:49.867196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.867003 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerName="pull"
Apr 22 14:22:49.867196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.867020 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerName="util"
Apr 22 14:22:49.867196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.867025 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerName="util"
Apr 22 14:22:49.867196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.867035 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerName="extract" Apr 22 14:22:49.867196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.867041 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerName="extract" Apr 22 14:22:49.867196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.867080 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="54cbf813-e56a-48c3-b5bf-c01886f25ce5" containerName="extract" Apr 22 14:22:49.869885 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.869868 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:49.873715 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.873687 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 14:22:49.874080 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.873704 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 14:22:49.874739 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.874716 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6bzv\"" Apr 22 14:22:49.882042 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.882020 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr"] Apr 22 14:22:49.931999 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.931959 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") 
" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:49.931999 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.932005 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:49.932266 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:49.932091 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhmjr\" (UniqueName: \"kubernetes.io/projected/af109fa2-ffa4-4849-b65c-5e8882f85b68-kube-api-access-jhmjr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:50.032742 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:50.032708 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:50.032904 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:50.032750 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:50.032904 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:50.032870 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhmjr\" (UniqueName: \"kubernetes.io/projected/af109fa2-ffa4-4849-b65c-5e8882f85b68-kube-api-access-jhmjr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:50.033070 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:50.033052 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:50.033137 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:50.033118 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:50.044237 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:50.044208 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhmjr\" (UniqueName: \"kubernetes.io/projected/af109fa2-ffa4-4849-b65c-5e8882f85b68-kube-api-access-jhmjr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:50.181258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:50.181159 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:50.300067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:50.300044 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr"] Apr 22 14:22:50.302135 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:22:50.302109 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf109fa2_ffa4_4849_b65c_5e8882f85b68.slice/crio-1c66eba39934d4225c1602be38344b916e65786d39fe2b0c761016fed1cd4a35 WatchSource:0}: Error finding container 1c66eba39934d4225c1602be38344b916e65786d39fe2b0c761016fed1cd4a35: Status 404 returned error can't find the container with id 1c66eba39934d4225c1602be38344b916e65786d39fe2b0c761016fed1cd4a35 Apr 22 14:22:51.290593 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:51.290557 2562 generic.go:358] "Generic (PLEG): container finished" podID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerID="0212bb19ba839a07499ed4ddfaf1227d2707013f7daedac895a074a33a19c97f" exitCode=0 Apr 22 14:22:51.290962 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:51.290642 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" event={"ID":"af109fa2-ffa4-4849-b65c-5e8882f85b68","Type":"ContainerDied","Data":"0212bb19ba839a07499ed4ddfaf1227d2707013f7daedac895a074a33a19c97f"} Apr 22 14:22:51.290962 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:51.290690 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" event={"ID":"af109fa2-ffa4-4849-b65c-5e8882f85b68","Type":"ContainerStarted","Data":"1c66eba39934d4225c1602be38344b916e65786d39fe2b0c761016fed1cd4a35"} Apr 22 14:22:52.295918 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:52.295825 2562 generic.go:358] "Generic (PLEG): container finished" podID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerID="930f7c1b8257f4f1f5b4f2330504f2ce9c0db3596abdf7ff72689c41935d33db" exitCode=0 Apr 22 14:22:52.296264 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:52.295929 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" event={"ID":"af109fa2-ffa4-4849-b65c-5e8882f85b68","Type":"ContainerDied","Data":"930f7c1b8257f4f1f5b4f2330504f2ce9c0db3596abdf7ff72689c41935d33db"} Apr 22 14:22:53.300024 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:53.299985 2562 generic.go:358] "Generic (PLEG): container finished" podID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerID="e334227cf86517f8b24f1f9006b66e922b94476832ae8059597c0f8638c3ef6b" exitCode=0 Apr 22 14:22:53.300024 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:53.300027 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" event={"ID":"af109fa2-ffa4-4849-b65c-5e8882f85b68","Type":"ContainerDied","Data":"e334227cf86517f8b24f1f9006b66e922b94476832ae8059597c0f8638c3ef6b"} Apr 22 14:22:54.422534 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.422512 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:22:54.469104 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.469075 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhmjr\" (UniqueName: \"kubernetes.io/projected/af109fa2-ffa4-4849-b65c-5e8882f85b68-kube-api-access-jhmjr\") pod \"af109fa2-ffa4-4849-b65c-5e8882f85b68\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " Apr 22 14:22:54.469263 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.469127 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-bundle\") pod \"af109fa2-ffa4-4849-b65c-5e8882f85b68\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " Apr 22 14:22:54.469263 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.469153 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-util\") pod \"af109fa2-ffa4-4849-b65c-5e8882f85b68\" (UID: \"af109fa2-ffa4-4849-b65c-5e8882f85b68\") " Apr 22 14:22:54.469994 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.469964 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-bundle" (OuterVolumeSpecName: "bundle") pod "af109fa2-ffa4-4849-b65c-5e8882f85b68" (UID: "af109fa2-ffa4-4849-b65c-5e8882f85b68"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:22:54.471243 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.471206 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af109fa2-ffa4-4849-b65c-5e8882f85b68-kube-api-access-jhmjr" (OuterVolumeSpecName: "kube-api-access-jhmjr") pod "af109fa2-ffa4-4849-b65c-5e8882f85b68" (UID: "af109fa2-ffa4-4849-b65c-5e8882f85b68"). InnerVolumeSpecName "kube-api-access-jhmjr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:22:54.474899 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.474873 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-util" (OuterVolumeSpecName: "util") pod "af109fa2-ffa4-4849-b65c-5e8882f85b68" (UID: "af109fa2-ffa4-4849-b65c-5e8882f85b68"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:22:54.570425 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.570345 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-util\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:22:54.570425 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.570371 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhmjr\" (UniqueName: \"kubernetes.io/projected/af109fa2-ffa4-4849-b65c-5e8882f85b68-kube-api-access-jhmjr\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:22:54.570425 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:54.570381 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af109fa2-ffa4-4849-b65c-5e8882f85b68-bundle\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:22:55.307395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:55.307311 2562 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" event={"ID":"af109fa2-ffa4-4849-b65c-5e8882f85b68","Type":"ContainerDied","Data":"1c66eba39934d4225c1602be38344b916e65786d39fe2b0c761016fed1cd4a35"} Apr 22 14:22:55.307395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:55.307346 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c66eba39934d4225c1602be38344b916e65786d39fe2b0c761016fed1cd4a35" Apr 22 14:22:55.307395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:22:55.307391 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483596vzr" Apr 22 14:23:05.494444 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.494407 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs"] Apr 22 14:23:05.494915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.494697 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerName="pull" Apr 22 14:23:05.494915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.494711 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerName="pull" Apr 22 14:23:05.494915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.494731 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerName="extract" Apr 22 14:23:05.494915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.494736 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerName="extract" Apr 22 14:23:05.494915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.494744 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerName="util" Apr 22 14:23:05.494915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.494752 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerName="util" Apr 22 14:23:05.494915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.494801 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="af109fa2-ffa4-4849-b65c-5e8882f85b68" containerName="extract" Apr 22 14:23:05.498509 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.498486 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:05.504919 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.504891 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 14:23:05.505052 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.504940 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6bzv\"" Apr 22 14:23:05.506232 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.506215 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 14:23:05.522683 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.522638 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs"] Apr 22 14:23:05.550600 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.550566 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:05.550781 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.550616 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:05.550781 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.550747 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txlmf\" (UniqueName: \"kubernetes.io/projected/34d86dd1-fc97-40e9-9f4e-a43c41428b29-kube-api-access-txlmf\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:05.651333 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.651288 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txlmf\" (UniqueName: \"kubernetes.io/projected/34d86dd1-fc97-40e9-9f4e-a43c41428b29-kube-api-access-txlmf\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:05.651541 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.651369 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:05.651541 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.651410 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:05.651805 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.651783 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:05.651881 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.651820 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:05.675184 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.675156 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txlmf\" (UniqueName: \"kubernetes.io/projected/34d86dd1-fc97-40e9-9f4e-a43c41428b29-kube-api-access-txlmf\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 
14:23:05.807640 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:05.807533 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:06.141244 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:06.141220 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs"] Apr 22 14:23:06.143843 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:06.143810 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34d86dd1_fc97_40e9_9f4e_a43c41428b29.slice/crio-2f3e4aa04569535b4772c75dcc0f3206882924b515c3b8240c0e64d11745f198 WatchSource:0}: Error finding container 2f3e4aa04569535b4772c75dcc0f3206882924b515c3b8240c0e64d11745f198: Status 404 returned error can't find the container with id 2f3e4aa04569535b4772c75dcc0f3206882924b515c3b8240c0e64d11745f198 Apr 22 14:23:06.342220 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:06.342170 2562 generic.go:358] "Generic (PLEG): container finished" podID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerID="851ae76b69467472bff556571fa74ed48b6abc257d305b3ce93fbdc2f44f3632" exitCode=0 Apr 22 14:23:06.342404 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:06.342256 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" event={"ID":"34d86dd1-fc97-40e9-9f4e-a43c41428b29","Type":"ContainerDied","Data":"851ae76b69467472bff556571fa74ed48b6abc257d305b3ce93fbdc2f44f3632"} Apr 22 14:23:06.342404 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:06.342294 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" 
event={"ID":"34d86dd1-fc97-40e9-9f4e-a43c41428b29","Type":"ContainerStarted","Data":"2f3e4aa04569535b4772c75dcc0f3206882924b515c3b8240c0e64d11745f198"} Apr 22 14:23:08.349813 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:08.349778 2562 generic.go:358] "Generic (PLEG): container finished" podID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerID="a2f508a6285a09a590c36fd3f8646f40ef39c293252bf15c3f97c427411fbcef" exitCode=0 Apr 22 14:23:08.350203 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:08.349855 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" event={"ID":"34d86dd1-fc97-40e9-9f4e-a43c41428b29","Type":"ContainerDied","Data":"a2f508a6285a09a590c36fd3f8646f40ef39c293252bf15c3f97c427411fbcef"} Apr 22 14:23:09.126317 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.124971 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv"] Apr 22 14:23:09.128559 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.128535 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:09.139344 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.139156 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-l4l2m\"" Apr 22 14:23:09.140098 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.139922 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 22 14:23:09.148790 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.148766 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 22 14:23:09.181530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.181506 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdgpd\" (UniqueName: \"kubernetes.io/projected/6de8a6c8-14a2-4926-92d5-db91c8df3896-kube-api-access-xdgpd\") pod \"servicemesh-operator3-55f49c5f94-7zdwv\" (UID: \"6de8a6c8-14a2-4926-92d5-db91c8df3896\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:09.181614 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.181576 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6de8a6c8-14a2-4926-92d5-db91c8df3896-operator-config\") pod \"servicemesh-operator3-55f49c5f94-7zdwv\" (UID: \"6de8a6c8-14a2-4926-92d5-db91c8df3896\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:09.190351 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.190327 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv"] Apr 22 14:23:09.282219 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.282189 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xdgpd\" (UniqueName: \"kubernetes.io/projected/6de8a6c8-14a2-4926-92d5-db91c8df3896-kube-api-access-xdgpd\") pod \"servicemesh-operator3-55f49c5f94-7zdwv\" (UID: \"6de8a6c8-14a2-4926-92d5-db91c8df3896\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:09.282396 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.282269 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6de8a6c8-14a2-4926-92d5-db91c8df3896-operator-config\") pod \"servicemesh-operator3-55f49c5f94-7zdwv\" (UID: \"6de8a6c8-14a2-4926-92d5-db91c8df3896\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:09.284733 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.284713 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6de8a6c8-14a2-4926-92d5-db91c8df3896-operator-config\") pod \"servicemesh-operator3-55f49c5f94-7zdwv\" (UID: \"6de8a6c8-14a2-4926-92d5-db91c8df3896\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:09.302789 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.302758 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdgpd\" (UniqueName: \"kubernetes.io/projected/6de8a6c8-14a2-4926-92d5-db91c8df3896-kube-api-access-xdgpd\") pod \"servicemesh-operator3-55f49c5f94-7zdwv\" (UID: \"6de8a6c8-14a2-4926-92d5-db91c8df3896\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:09.354216 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.354183 2562 generic.go:358] "Generic (PLEG): container finished" podID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerID="a5de11fbd8d6cbfad36344e4e74297550cc4c92cf25e7a00410dd59fef724a8b" exitCode=0 Apr 22 14:23:09.354573 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:23:09.354224 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" event={"ID":"34d86dd1-fc97-40e9-9f4e-a43c41428b29","Type":"ContainerDied","Data":"a5de11fbd8d6cbfad36344e4e74297550cc4c92cf25e7a00410dd59fef724a8b"} Apr 22 14:23:09.442413 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.442325 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:09.599940 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:09.599908 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv"] Apr 22 14:23:09.603499 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:09.603473 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6de8a6c8_14a2_4926_92d5_db91c8df3896.slice/crio-8d998cc6efef35c99001fa754fe12dd60340232a2a3a3a53f7a270b54023f341 WatchSource:0}: Error finding container 8d998cc6efef35c99001fa754fe12dd60340232a2a3a3a53f7a270b54023f341: Status 404 returned error can't find the container with id 8d998cc6efef35c99001fa754fe12dd60340232a2a3a3a53f7a270b54023f341 Apr 22 14:23:10.357994 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.357959 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" event={"ID":"6de8a6c8-14a2-4926-92d5-db91c8df3896","Type":"ContainerStarted","Data":"8d998cc6efef35c99001fa754fe12dd60340232a2a3a3a53f7a270b54023f341"} Apr 22 14:23:10.498002 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.497973 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:10.593842 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.593810 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-bundle\") pod \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " Apr 22 14:23:10.593842 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.593853 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-util\") pod \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " Apr 22 14:23:10.594069 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.593917 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txlmf\" (UniqueName: \"kubernetes.io/projected/34d86dd1-fc97-40e9-9f4e-a43c41428b29-kube-api-access-txlmf\") pod \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\" (UID: \"34d86dd1-fc97-40e9-9f4e-a43c41428b29\") " Apr 22 14:23:10.595120 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.595083 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-bundle" (OuterVolumeSpecName: "bundle") pod "34d86dd1-fc97-40e9-9f4e-a43c41428b29" (UID: "34d86dd1-fc97-40e9-9f4e-a43c41428b29"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:10.596511 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.596482 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d86dd1-fc97-40e9-9f4e-a43c41428b29-kube-api-access-txlmf" (OuterVolumeSpecName: "kube-api-access-txlmf") pod "34d86dd1-fc97-40e9-9f4e-a43c41428b29" (UID: "34d86dd1-fc97-40e9-9f4e-a43c41428b29"). InnerVolumeSpecName "kube-api-access-txlmf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:10.603533 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.603501 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-util" (OuterVolumeSpecName: "util") pod "34d86dd1-fc97-40e9-9f4e-a43c41428b29" (UID: "34d86dd1-fc97-40e9-9f4e-a43c41428b29"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:10.695592 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.695475 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-txlmf\" (UniqueName: \"kubernetes.io/projected/34d86dd1-fc97-40e9-9f4e-a43c41428b29-kube-api-access-txlmf\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:10.695592 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.695549 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-bundle\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:10.695592 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:10.695564 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34d86dd1-fc97-40e9-9f4e-a43c41428b29-util\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:11.362489 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:11.362453 2562 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" event={"ID":"34d86dd1-fc97-40e9-9f4e-a43c41428b29","Type":"ContainerDied","Data":"2f3e4aa04569535b4772c75dcc0f3206882924b515c3b8240c0e64d11745f198"} Apr 22 14:23:11.362489 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:11.362491 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f3e4aa04569535b4772c75dcc0f3206882924b515c3b8240c0e64d11745f198" Apr 22 14:23:11.363067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:11.362544 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebr4rjs" Apr 22 14:23:13.370176 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:13.370139 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" event={"ID":"6de8a6c8-14a2-4926-92d5-db91c8df3896","Type":"ContainerStarted","Data":"a9e144ccfb68d375e76b271c04fae2eae93ddf87be23b4e1482738175ada9355"} Apr 22 14:23:13.370558 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:13.370219 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:13.394457 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:13.394404 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" podStartSLOduration=1.6698111820000001 podStartE2EDuration="4.394390389s" podCreationTimestamp="2026-04-22 14:23:09 +0000 UTC" firstStartedPulling="2026-04-22 14:23:09.606107493 +0000 UTC m=+493.305468171" lastFinishedPulling="2026-04-22 14:23:12.330686717 +0000 UTC m=+496.030047378" observedRunningTime="2026-04-22 14:23:13.392861869 +0000 UTC m=+497.092222552" watchObservedRunningTime="2026-04-22 14:23:13.394390389 +0000 UTC m=+497.093751072" Apr 22 
14:23:18.514469 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.514425 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj"] Apr 22 14:23:18.514996 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.514822 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerName="util" Apr 22 14:23:18.514996 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.514840 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerName="util" Apr 22 14:23:18.514996 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.514859 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerName="extract" Apr 22 14:23:18.514996 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.514868 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerName="extract" Apr 22 14:23:18.514996 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.514879 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerName="pull" Apr 22 14:23:18.514996 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.514887 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerName="pull" Apr 22 14:23:18.514996 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.514967 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="34d86dd1-fc97-40e9-9f4e-a43c41428b29" containerName="extract" Apr 22 14:23:18.517866 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.517844 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.520714 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.520686 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 14:23:18.520848 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.520694 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 14:23:18.520848 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.520814 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-rkms6\"" Apr 22 14:23:18.520848 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.520841 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 14:23:18.520996 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.520845 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 22 14:23:18.521885 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.521871 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 22 14:23:18.521933 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.521892 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 22 14:23:18.530125 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.530099 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj"] Apr 22 14:23:18.558343 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.558299 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.558538 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.558365 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7z8\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-kube-api-access-jb7z8\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.558538 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.558411 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.558538 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.558476 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1a032912-f35e-4dcc-a2a7-fed9e85e297a-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.558538 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.558518 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: 
\"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.558793 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.558590 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.558793 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.558699 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.659387 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.659347 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1a032912-f35e-4dcc-a2a7-fed9e85e297a-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.659587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.659393 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.659587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.659428 
2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.659587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.659455 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.659587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.659487 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.659587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.659521 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7z8\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-kube-api-access-jb7z8\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.659587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.659568 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-dns-cert\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.660260 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.660178 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.661976 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.661953 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.662197 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.662173 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1a032912-f35e-4dcc-a2a7-fed9e85e297a-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.662306 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.662292 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.662452 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.662428 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.668569 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.668546 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.669811 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.669792 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7z8\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-kube-api-access-jb7z8\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8rhj\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.827368 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.827336 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:18.968049 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:18.968021 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj"] Apr 22 14:23:18.969181 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:18.969155 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a032912_f35e_4dcc_a2a7_fed9e85e297a.slice/crio-840931c7772e3cdb1aeac4a965cc5d64cc361c1847a6226a7e53513ddbcc862f WatchSource:0}: Error finding container 840931c7772e3cdb1aeac4a965cc5d64cc361c1847a6226a7e53513ddbcc862f: Status 404 returned error can't find the container with id 840931c7772e3cdb1aeac4a965cc5d64cc361c1847a6226a7e53513ddbcc862f Apr 22 14:23:19.391338 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:19.391279 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" event={"ID":"1a032912-f35e-4dcc-a2a7-fed9e85e297a","Type":"ContainerStarted","Data":"840931c7772e3cdb1aeac4a965cc5d64cc361c1847a6226a7e53513ddbcc862f"} Apr 22 14:23:21.239333 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:21.239289 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:23:21.239694 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:21.239361 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:23:21.399631 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:21.399585 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" 
event={"ID":"1a032912-f35e-4dcc-a2a7-fed9e85e297a","Type":"ContainerStarted","Data":"9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2"} Apr 22 14:23:21.400459 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:21.400429 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:21.423391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:21.423318 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" podStartSLOduration=1.155315134 podStartE2EDuration="3.423298025s" podCreationTimestamp="2026-04-22 14:23:18 +0000 UTC" firstStartedPulling="2026-04-22 14:23:18.971089004 +0000 UTC m=+502.670449664" lastFinishedPulling="2026-04-22 14:23:21.239071893 +0000 UTC m=+504.938432555" observedRunningTime="2026-04-22 14:23:21.422077344 +0000 UTC m=+505.121438026" watchObservedRunningTime="2026-04-22 14:23:21.423298025 +0000 UTC m=+505.122658708" Apr 22 14:23:22.408003 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:22.407957 2562 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-l8rhj container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 22 14:23:22.408441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:22.408023 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" podUID="1a032912-f35e-4dcc-a2a7-fed9e85e297a" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:23:23.410055 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:23.410024 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:23:24.376734 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:24.376706 2562 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7zdwv" Apr 22 14:23:25.407079 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.407042 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482"] Apr 22 14:23:25.410839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.410817 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.414992 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.414759 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-rfk7w\"" Apr 22 14:23:25.425835 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.425805 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482"] Apr 22 14:23:25.522311 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.522271 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.522311 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.522315 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/89666a12-f26d-461b-a935-b30133ba67c1-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.522524 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.522390 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/89666a12-f26d-461b-a935-b30133ba67c1-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.522524 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.522468 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89666a12-f26d-461b-a935-b30133ba67c1-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.522524 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.522510 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.522621 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.522534 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.522621 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.522594 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.522709 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.522636 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5rhc\" (UniqueName: \"kubernetes.io/projected/89666a12-f26d-461b-a935-b30133ba67c1-kube-api-access-q5rhc\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.522709 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.522691 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623263 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623227 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/89666a12-f26d-461b-a935-b30133ba67c1-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623285 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89666a12-f26d-461b-a935-b30133ba67c1-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623326 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623359 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623594 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623526 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623594 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623582 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5rhc\" (UniqueName: \"kubernetes.io/projected/89666a12-f26d-461b-a935-b30133ba67c1-kube-api-access-q5rhc\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623792 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623630 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623792 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623687 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623792 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623735 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/89666a12-f26d-461b-a935-b30133ba67c1-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.623792 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.623784 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.624054 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.624029 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.624054 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.624054 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/89666a12-f26d-461b-a935-b30133ba67c1-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.624212 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.624055 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.624212 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.624149 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.626047 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.626026 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/89666a12-f26d-461b-a935-b30133ba67c1-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.626178 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.626160 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/89666a12-f26d-461b-a935-b30133ba67c1-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.640390 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.640360 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5rhc\" (UniqueName: \"kubernetes.io/projected/89666a12-f26d-461b-a935-b30133ba67c1-kube-api-access-q5rhc\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.663814 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.663739 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89666a12-f26d-461b-a935-b30133ba67c1-istio-token\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-vs482\" (UID: \"89666a12-f26d-461b-a935-b30133ba67c1\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.722083 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.722048 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:25.858596 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:25.858567 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89666a12_f26d_461b_a935_b30133ba67c1.slice/crio-d53fbd82b86402a44627a398d310bbdc2a2bcd5bf2bc19779365206a0a71603a WatchSource:0}: Error finding container d53fbd82b86402a44627a398d310bbdc2a2bcd5bf2bc19779365206a0a71603a: Status 404 returned error can't find the container with id d53fbd82b86402a44627a398d310bbdc2a2bcd5bf2bc19779365206a0a71603a Apr 22 14:23:25.859951 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:25.859928 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482"] Apr 22 14:23:26.421815 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:26.421774 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" event={"ID":"89666a12-f26d-461b-a935-b30133ba67c1","Type":"ContainerStarted","Data":"d53fbd82b86402a44627a398d310bbdc2a2bcd5bf2bc19779365206a0a71603a"} Apr 22 14:23:28.398840 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:28.398791 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:23:28.399199 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:28.398897 2562 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:23:28.399199 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:28.398948 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:23:29.433495 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:29.433462 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" event={"ID":"89666a12-f26d-461b-a935-b30133ba67c1","Type":"ContainerStarted","Data":"f36d2a4e51b37ff691f244413073275543ffa663605b1dbc0b2b5f9f15ab1cf8"} Apr 22 14:23:29.464549 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:29.464495 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" podStartSLOduration=1.926820481 podStartE2EDuration="4.464476844s" podCreationTimestamp="2026-04-22 14:23:25 +0000 UTC" firstStartedPulling="2026-04-22 14:23:25.860881146 +0000 UTC m=+509.560241806" lastFinishedPulling="2026-04-22 14:23:28.398537506 +0000 UTC m=+512.097898169" observedRunningTime="2026-04-22 14:23:29.461610481 +0000 UTC m=+513.160971162" watchObservedRunningTime="2026-04-22 14:23:29.464476844 +0000 UTC m=+513.163837526" Apr 22 14:23:29.722858 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:29.722762 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:29.727389 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:29.727362 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:30.437149 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:23:30.437119 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:30.438195 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:30.438176 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-vs482" Apr 22 14:23:35.911888 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:35.911853 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44"] Apr 22 14:23:35.915258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:35.915241 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:35.922836 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:35.922808 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 14:23:35.922965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:35.922854 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6bzv\"" Apr 22 14:23:35.922965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:35.922855 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 14:23:35.932974 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:35.932950 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44"] Apr 22 14:23:36.010760 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.010724 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.010760 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.010766 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.010971 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.010947 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8cl2\" (UniqueName: \"kubernetes.io/projected/b396ee1d-9125-4380-8108-88297a483b24-kube-api-access-h8cl2\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.016992 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.016963 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx"] Apr 22 14:23:36.020200 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.020182 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.033060 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.033025 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx"] Apr 22 14:23:36.097742 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.097704 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh"] Apr 22 14:23:36.101052 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.101036 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.110286 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.110251 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh"] Apr 22 14:23:36.111795 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.111771 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.111921 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.111825 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.111986 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.111909 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.111986 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.111966 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.112095 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.112007 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxpzq\" (UniqueName: \"kubernetes.io/projected/99dc2b8a-b3f3-4614-acfc-48949528d76a-kube-api-access-cxpzq\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.112147 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.112111 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8cl2\" (UniqueName: \"kubernetes.io/projected/b396ee1d-9125-4380-8108-88297a483b24-kube-api-access-h8cl2\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.112197 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.112182 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.112246 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.112195 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.121099 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.121071 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8cl2\" (UniqueName: \"kubernetes.io/projected/b396ee1d-9125-4380-8108-88297a483b24-kube-api-access-h8cl2\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.187281 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.187199 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7"] Apr 22 14:23:36.190658 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.190626 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.205529 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.205499 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7"] Apr 22 14:23:36.213338 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.213315 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.213454 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.213348 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxpzq\" (UniqueName: \"kubernetes.io/projected/99dc2b8a-b3f3-4614-acfc-48949528d76a-kube-api-access-cxpzq\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.213454 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.213383 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6spbd\" (UniqueName: \"kubernetes.io/projected/6d48598e-75f2-425d-b2b3-adadae545d98-kube-api-access-6spbd\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.213454 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.213443 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.213582 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.213475 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.213582 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.213507 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.213796 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.213776 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.213837 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.213800 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.222565 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.222538 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxpzq\" (UniqueName: \"kubernetes.io/projected/99dc2b8a-b3f3-4614-acfc-48949528d76a-kube-api-access-cxpzq\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.224385 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.224367 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:36.314378 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.314328 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.314554 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.314393 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.314554 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.314458 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.314554 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.314490 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.314554 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.314527 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz68j\" (UniqueName: \"kubernetes.io/projected/fd80229c-7d84-4846-8c33-eb679f2366dd-kube-api-access-vz68j\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.314815 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.314569 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6spbd\" (UniqueName: \"kubernetes.io/projected/6d48598e-75f2-425d-b2b3-adadae545d98-kube-api-access-6spbd\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 
14:23:36.314872 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.314832 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.314981 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.314956 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.326057 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.326029 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6spbd\" (UniqueName: \"kubernetes.io/projected/6d48598e-75f2-425d-b2b3-adadae545d98-kube-api-access-6spbd\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.328850 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.328819 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:36.343357 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.343336 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44"] Apr 22 14:23:36.345553 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:36.345529 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb396ee1d_9125_4380_8108_88297a483b24.slice/crio-87ed935443216fe47ee67e042179b7728c13fb7f1aa2a9ad593c1760d180c05e WatchSource:0}: Error finding container 87ed935443216fe47ee67e042179b7728c13fb7f1aa2a9ad593c1760d180c05e: Status 404 returned error can't find the container with id 87ed935443216fe47ee67e042179b7728c13fb7f1aa2a9ad593c1760d180c05e Apr 22 14:23:36.410974 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.410945 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:36.415543 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.415502 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.415543 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.415544 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.415783 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.415576 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz68j\" (UniqueName: \"kubernetes.io/projected/fd80229c-7d84-4846-8c33-eb679f2366dd-kube-api-access-vz68j\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.416241 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.416217 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.416325 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.416266 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.433026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.432683 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz68j\" (UniqueName: \"kubernetes.io/projected/fd80229c-7d84-4846-8c33-eb679f2366dd-kube-api-access-vz68j\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.458291 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.458209 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" event={"ID":"b396ee1d-9125-4380-8108-88297a483b24","Type":"ContainerStarted","Data":"356fa3e5c15110da6cb12aedbb17dff3819dd269cf76d05def0aeeb5e010136a"} Apr 22 14:23:36.458291 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.458254 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" event={"ID":"b396ee1d-9125-4380-8108-88297a483b24","Type":"ContainerStarted","Data":"87ed935443216fe47ee67e042179b7728c13fb7f1aa2a9ad593c1760d180c05e"} Apr 22 14:23:36.470817 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.470787 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx"] Apr 22 14:23:36.471447 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:36.471419 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99dc2b8a_b3f3_4614_acfc_48949528d76a.slice/crio-751db56d23923acec8ca12acc8c34c0fd62f0321580f23ddf803ec46f06c3399 WatchSource:0}: Error finding container 751db56d23923acec8ca12acc8c34c0fd62f0321580f23ddf803ec46f06c3399: Status 404 returned error can't find the container with id 751db56d23923acec8ca12acc8c34c0fd62f0321580f23ddf803ec46f06c3399 Apr 22 14:23:36.498898 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.498870 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:36.545478 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.545448 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh"] Apr 22 14:23:36.551431 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:36.551374 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d48598e_75f2_425d_b2b3_adadae545d98.slice/crio-e168055298d753ae58947a6479e2635ffad805d1c6dab91e8bccfe60d7ffe85d WatchSource:0}: Error finding container e168055298d753ae58947a6479e2635ffad805d1c6dab91e8bccfe60d7ffe85d: Status 404 returned error can't find the container with id e168055298d753ae58947a6479e2635ffad805d1c6dab91e8bccfe60d7ffe85d Apr 22 14:23:36.638086 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:36.638061 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7"] Apr 22 14:23:36.671375 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:36.671346 
2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd80229c_7d84_4846_8c33_eb679f2366dd.slice/crio-8d1f242bdf74c7a3cb42f6dc258caed968805af1d0f34f7e6136cb438cc86eea WatchSource:0}: Error finding container 8d1f242bdf74c7a3cb42f6dc258caed968805af1d0f34f7e6136cb438cc86eea: Status 404 returned error can't find the container with id 8d1f242bdf74c7a3cb42f6dc258caed968805af1d0f34f7e6136cb438cc86eea Apr 22 14:23:37.462944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.462908 2562 generic.go:358] "Generic (PLEG): container finished" podID="6d48598e-75f2-425d-b2b3-adadae545d98" containerID="d5389f898291d950280e85bdb1006b39bbcb1ca5cc7f75ea9a9f822ada628b2a" exitCode=0 Apr 22 14:23:37.463459 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.462975 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" event={"ID":"6d48598e-75f2-425d-b2b3-adadae545d98","Type":"ContainerDied","Data":"d5389f898291d950280e85bdb1006b39bbcb1ca5cc7f75ea9a9f822ada628b2a"} Apr 22 14:23:37.463459 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.462999 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" event={"ID":"6d48598e-75f2-425d-b2b3-adadae545d98","Type":"ContainerStarted","Data":"e168055298d753ae58947a6479e2635ffad805d1c6dab91e8bccfe60d7ffe85d"} Apr 22 14:23:37.464441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.464325 2562 generic.go:358] "Generic (PLEG): container finished" podID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerID="139b73b511279d070edf0d1137c7a072479aef8d2a7badf44e3055d65e5e1d98" exitCode=0 Apr 22 14:23:37.464441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.464399 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" event={"ID":"99dc2b8a-b3f3-4614-acfc-48949528d76a","Type":"ContainerDied","Data":"139b73b511279d070edf0d1137c7a072479aef8d2a7badf44e3055d65e5e1d98"} Apr 22 14:23:37.464441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.464432 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" event={"ID":"99dc2b8a-b3f3-4614-acfc-48949528d76a","Type":"ContainerStarted","Data":"751db56d23923acec8ca12acc8c34c0fd62f0321580f23ddf803ec46f06c3399"} Apr 22 14:23:37.465887 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.465869 2562 generic.go:358] "Generic (PLEG): container finished" podID="b396ee1d-9125-4380-8108-88297a483b24" containerID="356fa3e5c15110da6cb12aedbb17dff3819dd269cf76d05def0aeeb5e010136a" exitCode=0 Apr 22 14:23:37.465965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.465948 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" event={"ID":"b396ee1d-9125-4380-8108-88297a483b24","Type":"ContainerDied","Data":"356fa3e5c15110da6cb12aedbb17dff3819dd269cf76d05def0aeeb5e010136a"} Apr 22 14:23:37.467663 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.467627 2562 generic.go:358] "Generic (PLEG): container finished" podID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerID="346d1841eb27bd7052bd90b5804c2cf639b5b68028473c72916ce64eb85a16d6" exitCode=0 Apr 22 14:23:37.467756 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:37.467671 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" event={"ID":"fd80229c-7d84-4846-8c33-eb679f2366dd","Type":"ContainerDied","Data":"346d1841eb27bd7052bd90b5804c2cf639b5b68028473c72916ce64eb85a16d6"} Apr 22 14:23:37.467756 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:23:37.467700 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" event={"ID":"fd80229c-7d84-4846-8c33-eb679f2366dd","Type":"ContainerStarted","Data":"8d1f242bdf74c7a3cb42f6dc258caed968805af1d0f34f7e6136cb438cc86eea"} Apr 22 14:23:38.472449 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:38.472419 2562 generic.go:358] "Generic (PLEG): container finished" podID="b396ee1d-9125-4380-8108-88297a483b24" containerID="e2d808e5e6c1bc54d296e77867d80415e5612eed30ce44bd8aa2e5cd392176b7" exitCode=0 Apr 22 14:23:38.472819 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:38.472498 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" event={"ID":"b396ee1d-9125-4380-8108-88297a483b24","Type":"ContainerDied","Data":"e2d808e5e6c1bc54d296e77867d80415e5612eed30ce44bd8aa2e5cd392176b7"} Apr 22 14:23:38.474111 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:38.474090 2562 generic.go:358] "Generic (PLEG): container finished" podID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerID="d7d8b63ebbdaf3027ad7b067723637a2c4b4d487fa1468cfdf3ec70260ba78f8" exitCode=0 Apr 22 14:23:38.474258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:38.474187 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" event={"ID":"fd80229c-7d84-4846-8c33-eb679f2366dd","Type":"ContainerDied","Data":"d7d8b63ebbdaf3027ad7b067723637a2c4b4d487fa1468cfdf3ec70260ba78f8"} Apr 22 14:23:38.475874 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:38.475851 2562 generic.go:358] "Generic (PLEG): container finished" podID="6d48598e-75f2-425d-b2b3-adadae545d98" containerID="ec428be5bf5dbfa5e0f8cc0925d7a7126578d885e001f15a5362d4707163091e" exitCode=0 Apr 22 14:23:38.475968 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:38.475923 2562 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" event={"ID":"6d48598e-75f2-425d-b2b3-adadae545d98","Type":"ContainerDied","Data":"ec428be5bf5dbfa5e0f8cc0925d7a7126578d885e001f15a5362d4707163091e"} Apr 22 14:23:38.477610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:38.477588 2562 generic.go:358] "Generic (PLEG): container finished" podID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerID="159d37c3e7d5b3bc2422f15f2843931c19692c82a879fffc997caf818f00d4ec" exitCode=0 Apr 22 14:23:38.477711 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:38.477625 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" event={"ID":"99dc2b8a-b3f3-4614-acfc-48949528d76a","Type":"ContainerDied","Data":"159d37c3e7d5b3bc2422f15f2843931c19692c82a879fffc997caf818f00d4ec"} Apr 22 14:23:39.483582 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:39.483546 2562 generic.go:358] "Generic (PLEG): container finished" podID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerID="c8690e366a5cfe49d33d40d4f746c79838346be2c375e5ee3a1fda12dfade591" exitCode=0 Apr 22 14:23:39.483994 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:39.483623 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" event={"ID":"fd80229c-7d84-4846-8c33-eb679f2366dd","Type":"ContainerDied","Data":"c8690e366a5cfe49d33d40d4f746c79838346be2c375e5ee3a1fda12dfade591"} Apr 22 14:23:39.485489 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:39.485463 2562 generic.go:358] "Generic (PLEG): container finished" podID="6d48598e-75f2-425d-b2b3-adadae545d98" containerID="6d17486151697b42ce463a34807a9cf18f1c59af304292c373ce9c33fc2117b4" exitCode=0 Apr 22 14:23:39.485634 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:39.485521 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" event={"ID":"6d48598e-75f2-425d-b2b3-adadae545d98","Type":"ContainerDied","Data":"6d17486151697b42ce463a34807a9cf18f1c59af304292c373ce9c33fc2117b4"} Apr 22 14:23:39.487275 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:39.487253 2562 generic.go:358] "Generic (PLEG): container finished" podID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerID="9d184b679da1fbdbd86a7d44ac0efac97ca5cddc84b250a7a531edd9326d3dec" exitCode=0 Apr 22 14:23:39.487396 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:39.487330 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" event={"ID":"99dc2b8a-b3f3-4614-acfc-48949528d76a","Type":"ContainerDied","Data":"9d184b679da1fbdbd86a7d44ac0efac97ca5cddc84b250a7a531edd9326d3dec"} Apr 22 14:23:39.489258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:39.489230 2562 generic.go:358] "Generic (PLEG): container finished" podID="b396ee1d-9125-4380-8108-88297a483b24" containerID="d022498d396daa69a76635d22277b476390d8064e963590f3ccca63791d5db9c" exitCode=0 Apr 22 14:23:39.489353 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:39.489266 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" event={"ID":"b396ee1d-9125-4380-8108-88297a483b24","Type":"ContainerDied","Data":"d022498d396daa69a76635d22277b476390d8064e963590f3ccca63791d5db9c"} Apr 22 14:23:40.669829 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.669803 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:40.701695 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.701643 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:40.704999 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.704978 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:40.708948 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.708923 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:40.755487 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.755407 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-util\") pod \"6d48598e-75f2-425d-b2b3-adadae545d98\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " Apr 22 14:23:40.755663 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.755490 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6spbd\" (UniqueName: \"kubernetes.io/projected/6d48598e-75f2-425d-b2b3-adadae545d98-kube-api-access-6spbd\") pod \"6d48598e-75f2-425d-b2b3-adadae545d98\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " Apr 22 14:23:40.755663 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.755549 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-bundle\") pod \"6d48598e-75f2-425d-b2b3-adadae545d98\" (UID: \"6d48598e-75f2-425d-b2b3-adadae545d98\") " Apr 22 14:23:40.756039 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.756009 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-bundle" (OuterVolumeSpecName: "bundle") pod 
"6d48598e-75f2-425d-b2b3-adadae545d98" (UID: "6d48598e-75f2-425d-b2b3-adadae545d98"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:40.757561 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.757525 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d48598e-75f2-425d-b2b3-adadae545d98-kube-api-access-6spbd" (OuterVolumeSpecName: "kube-api-access-6spbd") pod "6d48598e-75f2-425d-b2b3-adadae545d98" (UID: "6d48598e-75f2-425d-b2b3-adadae545d98"). InnerVolumeSpecName "kube-api-access-6spbd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:40.760488 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.760453 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-util" (OuterVolumeSpecName: "util") pod "6d48598e-75f2-425d-b2b3-adadae545d98" (UID: "6d48598e-75f2-425d-b2b3-adadae545d98"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:40.856566 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.856518 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-bundle\") pod \"b396ee1d-9125-4380-8108-88297a483b24\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " Apr 22 14:23:40.856566 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.856572 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxpzq\" (UniqueName: \"kubernetes.io/projected/99dc2b8a-b3f3-4614-acfc-48949528d76a-kube-api-access-cxpzq\") pod \"99dc2b8a-b3f3-4614-acfc-48949528d76a\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " Apr 22 14:23:40.856903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.856606 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8cl2\" (UniqueName: \"kubernetes.io/projected/b396ee1d-9125-4380-8108-88297a483b24-kube-api-access-h8cl2\") pod \"b396ee1d-9125-4380-8108-88297a483b24\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " Apr 22 14:23:40.856903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.856621 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-bundle\") pod \"99dc2b8a-b3f3-4614-acfc-48949528d76a\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " Apr 22 14:23:40.856903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.856640 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-bundle\") pod \"fd80229c-7d84-4846-8c33-eb679f2366dd\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " Apr 22 14:23:40.856903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.856713 2562 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz68j\" (UniqueName: \"kubernetes.io/projected/fd80229c-7d84-4846-8c33-eb679f2366dd-kube-api-access-vz68j\") pod \"fd80229c-7d84-4846-8c33-eb679f2366dd\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " Apr 22 14:23:40.856903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.856753 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-util\") pod \"99dc2b8a-b3f3-4614-acfc-48949528d76a\" (UID: \"99dc2b8a-b3f3-4614-acfc-48949528d76a\") " Apr 22 14:23:40.856903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.856779 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-util\") pod \"b396ee1d-9125-4380-8108-88297a483b24\" (UID: \"b396ee1d-9125-4380-8108-88297a483b24\") " Apr 22 14:23:40.856903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.856848 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-util\") pod \"fd80229c-7d84-4846-8c33-eb679f2366dd\" (UID: \"fd80229c-7d84-4846-8c33-eb679f2366dd\") " Apr 22 14:23:40.857243 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.857095 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6spbd\" (UniqueName: \"kubernetes.io/projected/6d48598e-75f2-425d-b2b3-adadae545d98-kube-api-access-6spbd\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.857243 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.857115 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-bundle\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.857243 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.857127 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d48598e-75f2-425d-b2b3-adadae545d98-util\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.857410 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.857380 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-bundle" (OuterVolumeSpecName: "bundle") pod "b396ee1d-9125-4380-8108-88297a483b24" (UID: "b396ee1d-9125-4380-8108-88297a483b24"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:40.857524 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.857498 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-bundle" (OuterVolumeSpecName: "bundle") pod "99dc2b8a-b3f3-4614-acfc-48949528d76a" (UID: "99dc2b8a-b3f3-4614-acfc-48949528d76a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:40.857591 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.857530 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-bundle" (OuterVolumeSpecName: "bundle") pod "fd80229c-7d84-4846-8c33-eb679f2366dd" (UID: "fd80229c-7d84-4846-8c33-eb679f2366dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:40.858939 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.858909 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99dc2b8a-b3f3-4614-acfc-48949528d76a-kube-api-access-cxpzq" (OuterVolumeSpecName: "kube-api-access-cxpzq") pod "99dc2b8a-b3f3-4614-acfc-48949528d76a" (UID: "99dc2b8a-b3f3-4614-acfc-48949528d76a"). InnerVolumeSpecName "kube-api-access-cxpzq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:40.859295 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.859273 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b396ee1d-9125-4380-8108-88297a483b24-kube-api-access-h8cl2" (OuterVolumeSpecName: "kube-api-access-h8cl2") pod "b396ee1d-9125-4380-8108-88297a483b24" (UID: "b396ee1d-9125-4380-8108-88297a483b24"). InnerVolumeSpecName "kube-api-access-h8cl2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:40.859792 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.859769 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd80229c-7d84-4846-8c33-eb679f2366dd-kube-api-access-vz68j" (OuterVolumeSpecName: "kube-api-access-vz68j") pod "fd80229c-7d84-4846-8c33-eb679f2366dd" (UID: "fd80229c-7d84-4846-8c33-eb679f2366dd"). InnerVolumeSpecName "kube-api-access-vz68j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:40.863194 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.863170 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-util" (OuterVolumeSpecName: "util") pod "fd80229c-7d84-4846-8c33-eb679f2366dd" (UID: "fd80229c-7d84-4846-8c33-eb679f2366dd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:40.863308 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.863289 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-util" (OuterVolumeSpecName: "util") pod "99dc2b8a-b3f3-4614-acfc-48949528d76a" (UID: "99dc2b8a-b3f3-4614-acfc-48949528d76a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:40.863465 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.863445 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-util" (OuterVolumeSpecName: "util") pod "b396ee1d-9125-4380-8108-88297a483b24" (UID: "b396ee1d-9125-4380-8108-88297a483b24"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:23:40.958297 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.958272 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-util\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.958297 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.958298 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-util\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.958444 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.958311 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b396ee1d-9125-4380-8108-88297a483b24-bundle\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.958444 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.958320 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxpzq\" (UniqueName: \"kubernetes.io/projected/99dc2b8a-b3f3-4614-acfc-48949528d76a-kube-api-access-cxpzq\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.958444 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.958329 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8cl2\" (UniqueName: \"kubernetes.io/projected/b396ee1d-9125-4380-8108-88297a483b24-kube-api-access-h8cl2\") on node \"ip-10-0-133-31.ec2.internal\" 
DevicePath \"\"" Apr 22 14:23:40.958444 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.958338 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-bundle\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.958444 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.958346 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd80229c-7d84-4846-8c33-eb679f2366dd-bundle\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.958444 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.958357 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vz68j\" (UniqueName: \"kubernetes.io/projected/fd80229c-7d84-4846-8c33-eb679f2366dd-kube-api-access-vz68j\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:40.958444 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:40.958366 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99dc2b8a-b3f3-4614-acfc-48949528d76a-util\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:23:41.501060 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.501015 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" event={"ID":"6d48598e-75f2-425d-b2b3-adadae545d98","Type":"ContainerDied","Data":"e168055298d753ae58947a6479e2635ffad805d1c6dab91e8bccfe60d7ffe85d"} Apr 22 14:23:41.501060 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.501057 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e168055298d753ae58947a6479e2635ffad805d1c6dab91e8bccfe60d7ffe85d" Apr 22 14:23:41.501293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.501072 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88gzsqh" Apr 22 14:23:41.502743 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.502721 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" event={"ID":"99dc2b8a-b3f3-4614-acfc-48949528d76a","Type":"ContainerDied","Data":"751db56d23923acec8ca12acc8c34c0fd62f0321580f23ddf803ec46f06c3399"} Apr 22 14:23:41.502743 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.502741 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bb68lx" Apr 22 14:23:41.502900 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.502745 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="751db56d23923acec8ca12acc8c34c0fd62f0321580f23ddf803ec46f06c3399" Apr 22 14:23:41.504642 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.504607 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" event={"ID":"b396ee1d-9125-4380-8108-88297a483b24","Type":"ContainerDied","Data":"87ed935443216fe47ee67e042179b7728c13fb7f1aa2a9ad593c1760d180c05e"} Apr 22 14:23:41.504776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.504640 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87ed935443216fe47ee67e042179b7728c13fb7f1aa2a9ad593c1760d180c05e" Apr 22 14:23:41.504776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.504615 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pqb44" Apr 22 14:23:41.506678 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.506525 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" event={"ID":"fd80229c-7d84-4846-8c33-eb679f2366dd","Type":"ContainerDied","Data":"8d1f242bdf74c7a3cb42f6dc258caed968805af1d0f34f7e6136cb438cc86eea"} Apr 22 14:23:41.506678 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.506563 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d1f242bdf74c7a3cb42f6dc258caed968805af1d0f34f7e6136cb438cc86eea" Apr 22 14:23:41.506678 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:41.506611 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30p9rn7" Apr 22 14:23:50.560560 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560484 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-dcrzg"] Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560760 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerName="extract" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560771 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerName="extract" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560781 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b396ee1d-9125-4380-8108-88297a483b24" containerName="util" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560787 2562 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b396ee1d-9125-4380-8108-88297a483b24" containerName="util" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560794 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d48598e-75f2-425d-b2b3-adadae545d98" containerName="pull" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560799 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d48598e-75f2-425d-b2b3-adadae545d98" containerName="pull" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560806 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerName="pull" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560811 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerName="pull" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560819 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerName="util" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560824 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerName="util" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560830 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b396ee1d-9125-4380-8108-88297a483b24" containerName="pull" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560835 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b396ee1d-9125-4380-8108-88297a483b24" containerName="pull" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560841 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d48598e-75f2-425d-b2b3-adadae545d98" containerName="extract" Apr 
22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560846 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d48598e-75f2-425d-b2b3-adadae545d98" containerName="extract" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560851 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerName="extract" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560856 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerName="extract" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560868 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b396ee1d-9125-4380-8108-88297a483b24" containerName="extract" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560872 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b396ee1d-9125-4380-8108-88297a483b24" containerName="extract" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560879 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d48598e-75f2-425d-b2b3-adadae545d98" containerName="util" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560884 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d48598e-75f2-425d-b2b3-adadae545d98" containerName="util" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560890 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerName="util" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560896 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerName="util" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560903 2562 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerName="pull" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560908 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerName="pull" Apr 22 14:23:50.560935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560945 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d48598e-75f2-425d-b2b3-adadae545d98" containerName="extract" Apr 22 14:23:50.561607 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560952 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="b396ee1d-9125-4380-8108-88297a483b24" containerName="extract" Apr 22 14:23:50.561607 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560961 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd80229c-7d84-4846-8c33-eb679f2366dd" containerName="extract" Apr 22 14:23:50.561607 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.560968 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="99dc2b8a-b3f3-4614-acfc-48949528d76a" containerName="extract" Apr 22 14:23:50.569197 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.569174 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" Apr 22 14:23:50.572067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.572049 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 14:23:50.573412 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.573396 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-wbg7t\"" Apr 22 14:23:50.573412 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.573406 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 14:23:50.579443 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.579422 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-dcrzg"] Apr 22 14:23:50.638367 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.638329 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrnwn\" (UniqueName: \"kubernetes.io/projected/97ac6cd5-d3a8-48eb-a86e-fcebb2a78549-kube-api-access-mrnwn\") pod \"authorino-operator-7587b89b76-dcrzg\" (UID: \"97ac6cd5-d3a8-48eb-a86e-fcebb2a78549\") " pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" Apr 22 14:23:50.739789 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.739757 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrnwn\" (UniqueName: \"kubernetes.io/projected/97ac6cd5-d3a8-48eb-a86e-fcebb2a78549-kube-api-access-mrnwn\") pod \"authorino-operator-7587b89b76-dcrzg\" (UID: \"97ac6cd5-d3a8-48eb-a86e-fcebb2a78549\") " pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" Apr 22 14:23:50.756061 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.756033 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrnwn\" 
(UniqueName: \"kubernetes.io/projected/97ac6cd5-d3a8-48eb-a86e-fcebb2a78549-kube-api-access-mrnwn\") pod \"authorino-operator-7587b89b76-dcrzg\" (UID: \"97ac6cd5-d3a8-48eb-a86e-fcebb2a78549\") " pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" Apr 22 14:23:50.880759 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:50.880721 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" Apr 22 14:23:51.031338 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:51.031310 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-dcrzg"] Apr 22 14:23:51.033903 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:51.033868 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ac6cd5_d3a8_48eb_a86e_fcebb2a78549.slice/crio-d43844e3d248f347c7fcdf515f0af2ffa8c0b862acdb01c3fe068ee50db655b8 WatchSource:0}: Error finding container d43844e3d248f347c7fcdf515f0af2ffa8c0b862acdb01c3fe068ee50db655b8: Status 404 returned error can't find the container with id d43844e3d248f347c7fcdf515f0af2ffa8c0b862acdb01c3fe068ee50db655b8 Apr 22 14:23:51.540688 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:51.540638 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" event={"ID":"97ac6cd5-d3a8-48eb-a86e-fcebb2a78549","Type":"ContainerStarted","Data":"d43844e3d248f347c7fcdf515f0af2ffa8c0b862acdb01c3fe068ee50db655b8"} Apr 22 14:23:53.549418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:53.549373 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" event={"ID":"97ac6cd5-d3a8-48eb-a86e-fcebb2a78549","Type":"ContainerStarted","Data":"da55cf73d547da3b9177707d6a909a1c1c218e128bc1f4fa0b9f2a632fa85b92"} Apr 22 14:23:53.549876 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:53.549447 2562 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" Apr 22 14:23:53.581071 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:53.581013 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" podStartSLOduration=1.192549399 podStartE2EDuration="3.58099719s" podCreationTimestamp="2026-04-22 14:23:50 +0000 UTC" firstStartedPulling="2026-04-22 14:23:51.035993517 +0000 UTC m=+534.735354177" lastFinishedPulling="2026-04-22 14:23:53.424441308 +0000 UTC m=+537.123801968" observedRunningTime="2026-04-22 14:23:53.578081636 +0000 UTC m=+537.277442319" watchObservedRunningTime="2026-04-22 14:23:53.58099719 +0000 UTC m=+537.280357865" Apr 22 14:23:56.210318 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.210282 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85"] Apr 22 14:23:56.213590 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.213573 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:23:56.216904 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.216880 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-kgtzp\"" Apr 22 14:23:56.227232 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.227206 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85"] Apr 22 14:23:56.284552 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.284514 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd91f939-be95-434d-b3a6-d197f8ea66fb-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-fhq85\" (UID: \"dd91f939-be95-434d-b3a6-d197f8ea66fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:23:56.284742 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.284557 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxkcq\" (UniqueName: \"kubernetes.io/projected/dd91f939-be95-434d-b3a6-d197f8ea66fb-kube-api-access-hxkcq\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-fhq85\" (UID: \"dd91f939-be95-434d-b3a6-d197f8ea66fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:23:56.385683 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.385594 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd91f939-be95-434d-b3a6-d197f8ea66fb-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-fhq85\" (UID: \"dd91f939-be95-434d-b3a6-d197f8ea66fb\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:23:56.385859 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.385700 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxkcq\" (UniqueName: \"kubernetes.io/projected/dd91f939-be95-434d-b3a6-d197f8ea66fb-kube-api-access-hxkcq\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-fhq85\" (UID: \"dd91f939-be95-434d-b3a6-d197f8ea66fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:23:56.385978 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.385958 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd91f939-be95-434d-b3a6-d197f8ea66fb-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-fhq85\" (UID: \"dd91f939-be95-434d-b3a6-d197f8ea66fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:23:56.394584 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.394555 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxkcq\" (UniqueName: \"kubernetes.io/projected/dd91f939-be95-434d-b3a6-d197f8ea66fb-kube-api-access-hxkcq\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-fhq85\" (UID: \"dd91f939-be95-434d-b3a6-d197f8ea66fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:23:56.523749 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.523644 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:23:56.645371 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:56.645338 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85"] Apr 22 14:23:56.646490 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:23:56.646461 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd91f939_be95_434d_b3a6_d197f8ea66fb.slice/crio-06495ca43b9757571fd4ac6be0ddb18c9b642d20ce7ee28db1dfb66cd08a9915 WatchSource:0}: Error finding container 06495ca43b9757571fd4ac6be0ddb18c9b642d20ce7ee28db1dfb66cd08a9915: Status 404 returned error can't find the container with id 06495ca43b9757571fd4ac6be0ddb18c9b642d20ce7ee28db1dfb66cd08a9915 Apr 22 14:23:57.566110 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:23:57.566052 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" event={"ID":"dd91f939-be95-434d-b3a6-d197f8ea66fb","Type":"ContainerStarted","Data":"06495ca43b9757571fd4ac6be0ddb18c9b642d20ce7ee28db1dfb66cd08a9915"} Apr 22 14:24:01.584197 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:01.584157 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" event={"ID":"dd91f939-be95-434d-b3a6-d197f8ea66fb","Type":"ContainerStarted","Data":"d74a2b5b737c0d695b961a534807368169c49e9db980323bbdff3b1f042b178c"} Apr 22 14:24:01.584715 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:01.584292 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:24:01.608156 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:01.608104 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" podStartSLOduration=1.721508632 podStartE2EDuration="5.608089606s" podCreationTimestamp="2026-04-22 14:23:56 +0000 UTC" firstStartedPulling="2026-04-22 14:23:56.648942718 +0000 UTC m=+540.348303385" lastFinishedPulling="2026-04-22 14:24:00.535523699 +0000 UTC m=+544.234884359" observedRunningTime="2026-04-22 14:24:01.6059057 +0000 UTC m=+545.305266383" watchObservedRunningTime="2026-04-22 14:24:01.608089606 +0000 UTC m=+545.307450288" Apr 22 14:24:04.554724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:04.554689 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-dcrzg" Apr 22 14:24:12.589981 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:12.589952 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-fhq85" Apr 22 14:24:45.869499 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.869462 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-85cgp"] Apr 22 14:24:45.878636 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.878610 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:24:45.881733 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.881702 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pkxlf\"" Apr 22 14:24:45.884750 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.884723 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-85cgp"] Apr 22 14:24:45.886616 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.886591 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 14:24:45.898902 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.898875 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/800b1787-62f8-4297-a35c-9bab9e7e7c78-config-file\") pod \"limitador-limitador-67566c68b4-85cgp\" (UID: \"800b1787-62f8-4297-a35c-9bab9e7e7c78\") " pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:24:45.899023 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.898972 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vwfp\" (UniqueName: \"kubernetes.io/projected/800b1787-62f8-4297-a35c-9bab9e7e7c78-kube-api-access-6vwfp\") pod \"limitador-limitador-67566c68b4-85cgp\" (UID: \"800b1787-62f8-4297-a35c-9bab9e7e7c78\") " pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:24:45.900856 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.900838 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-85cgp"] Apr 22 14:24:45.999458 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.999422 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vwfp\" 
(UniqueName: \"kubernetes.io/projected/800b1787-62f8-4297-a35c-9bab9e7e7c78-kube-api-access-6vwfp\") pod \"limitador-limitador-67566c68b4-85cgp\" (UID: \"800b1787-62f8-4297-a35c-9bab9e7e7c78\") " pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:24:45.999633 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:45.999468 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/800b1787-62f8-4297-a35c-9bab9e7e7c78-config-file\") pod \"limitador-limitador-67566c68b4-85cgp\" (UID: \"800b1787-62f8-4297-a35c-9bab9e7e7c78\") " pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:24:46.000062 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:46.000042 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/800b1787-62f8-4297-a35c-9bab9e7e7c78-config-file\") pod \"limitador-limitador-67566c68b4-85cgp\" (UID: \"800b1787-62f8-4297-a35c-9bab9e7e7c78\") " pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:24:46.009624 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:46.009596 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vwfp\" (UniqueName: \"kubernetes.io/projected/800b1787-62f8-4297-a35c-9bab9e7e7c78-kube-api-access-6vwfp\") pod \"limitador-limitador-67566c68b4-85cgp\" (UID: \"800b1787-62f8-4297-a35c-9bab9e7e7c78\") " pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:24:46.189913 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:46.189838 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:24:46.313832 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:46.313801 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-85cgp"] Apr 22 14:24:46.314559 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:24:46.314535 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800b1787_62f8_4297_a35c_9bab9e7e7c78.slice/crio-2131af70866cce75441aa5194c710d5ccdde782b567b66615ccfd7ba9791ebfc WatchSource:0}: Error finding container 2131af70866cce75441aa5194c710d5ccdde782b567b66615ccfd7ba9791ebfc: Status 404 returned error can't find the container with id 2131af70866cce75441aa5194c710d5ccdde782b567b66615ccfd7ba9791ebfc Apr 22 14:24:46.738101 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:46.738065 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" event={"ID":"800b1787-62f8-4297-a35c-9bab9e7e7c78","Type":"ContainerStarted","Data":"2131af70866cce75441aa5194c710d5ccdde782b567b66615ccfd7ba9791ebfc"} Apr 22 14:24:50.757794 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:50.757759 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" event={"ID":"800b1787-62f8-4297-a35c-9bab9e7e7c78","Type":"ContainerStarted","Data":"fc8b0ae8cde52cae61c9402b8e350d122555115e719ce71e2597e14ea9db322b"} Apr 22 14:24:50.758208 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:50.757946 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:24:50.778450 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:50.778403 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" podStartSLOduration=1.790652154 
podStartE2EDuration="5.778389035s" podCreationTimestamp="2026-04-22 14:24:45 +0000 UTC" firstStartedPulling="2026-04-22 14:24:46.317033995 +0000 UTC m=+590.016394658" lastFinishedPulling="2026-04-22 14:24:50.304770876 +0000 UTC m=+594.004131539" observedRunningTime="2026-04-22 14:24:50.775457847 +0000 UTC m=+594.474818531" watchObservedRunningTime="2026-04-22 14:24:50.778389035 +0000 UTC m=+594.477749716" Apr 22 14:24:56.836701 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:56.836671 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:24:56.837510 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:24:56.837490 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:25:01.767439 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:01.767411 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-85cgp" Apr 22 14:25:26.093150 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.093070 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj"] Apr 22 14:25:26.093504 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.093393 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" podUID="1a032912-f35e-4dcc-a2a7-fed9e85e297a" containerName="discovery" containerID="cri-o://9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2" gracePeriod=30 Apr 22 14:25:26.358013 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.357989 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:25:26.438345 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.438314 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-cacerts\") pod \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " Apr 22 14:25:26.438512 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.438355 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1a032912-f35e-4dcc-a2a7-fed9e85e297a-local-certs\") pod \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " Apr 22 14:25:26.438512 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.438385 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-token\") pod \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " Apr 22 14:25:26.438512 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.438403 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb7z8\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-kube-api-access-jb7z8\") pod \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " Apr 22 14:25:26.438512 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.438453 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-kubeconfig\") pod \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " Apr 22 14:25:26.438512 ip-10-0-133-31 kubenswrapper[2562]: 
I0422 14:25:26.438495 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-dns-cert\") pod \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " Apr 22 14:25:26.438787 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.438527 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-ca-configmap\") pod \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\" (UID: \"1a032912-f35e-4dcc-a2a7-fed9e85e297a\") " Apr 22 14:25:26.439101 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.439044 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "1a032912-f35e-4dcc-a2a7-fed9e85e297a" (UID: "1a032912-f35e-4dcc-a2a7-fed9e85e297a"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:25:26.441185 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.441150 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-cacerts" (OuterVolumeSpecName: "cacerts") pod "1a032912-f35e-4dcc-a2a7-fed9e85e297a" (UID: "1a032912-f35e-4dcc-a2a7-fed9e85e297a"). InnerVolumeSpecName "cacerts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:25:26.441290 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.441158 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-kube-api-access-jb7z8" (OuterVolumeSpecName: "kube-api-access-jb7z8") pod "1a032912-f35e-4dcc-a2a7-fed9e85e297a" (UID: "1a032912-f35e-4dcc-a2a7-fed9e85e297a"). InnerVolumeSpecName "kube-api-access-jb7z8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:25:26.441839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.441738 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a032912-f35e-4dcc-a2a7-fed9e85e297a-local-certs" (OuterVolumeSpecName: "local-certs") pod "1a032912-f35e-4dcc-a2a7-fed9e85e297a" (UID: "1a032912-f35e-4dcc-a2a7-fed9e85e297a"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:25:26.441839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.441743 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "1a032912-f35e-4dcc-a2a7-fed9e85e297a" (UID: "1a032912-f35e-4dcc-a2a7-fed9e85e297a"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:25:26.441839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.441776 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-token" (OuterVolumeSpecName: "istio-token") pod "1a032912-f35e-4dcc-a2a7-fed9e85e297a" (UID: "1a032912-f35e-4dcc-a2a7-fed9e85e297a"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:25:26.441985 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.441959 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "1a032912-f35e-4dcc-a2a7-fed9e85e297a" (UID: "1a032912-f35e-4dcc-a2a7-fed9e85e297a"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:25:26.540038 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.540004 2562 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-cacerts\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:25:26.540038 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.540032 2562 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1a032912-f35e-4dcc-a2a7-fed9e85e297a-local-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:25:26.540038 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.540045 2562 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-token\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:25:26.540267 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.540055 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jb7z8\" (UniqueName: \"kubernetes.io/projected/1a032912-f35e-4dcc-a2a7-fed9e85e297a-kube-api-access-jb7z8\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:25:26.540267 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.540063 2562 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-kubeconfig\") on 
node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:25:26.540267 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.540072 2562 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-dns-cert\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:25:26.540267 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.540080 2562 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1a032912-f35e-4dcc-a2a7-fed9e85e297a-istio-csr-ca-configmap\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:25:26.896017 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.895981 2562 generic.go:358] "Generic (PLEG): container finished" podID="1a032912-f35e-4dcc-a2a7-fed9e85e297a" containerID="9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2" exitCode=0 Apr 22 14:25:26.896220 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.896069 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" Apr 22 14:25:26.896220 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.896077 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" event={"ID":"1a032912-f35e-4dcc-a2a7-fed9e85e297a","Type":"ContainerDied","Data":"9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2"} Apr 22 14:25:26.896220 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.896131 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj" event={"ID":"1a032912-f35e-4dcc-a2a7-fed9e85e297a","Type":"ContainerDied","Data":"840931c7772e3cdb1aeac4a965cc5d64cc361c1847a6226a7e53513ddbcc862f"} Apr 22 14:25:26.896220 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.896168 2562 scope.go:117] "RemoveContainer" containerID="9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2" Apr 22 14:25:26.905779 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.905759 2562 scope.go:117] "RemoveContainer" containerID="9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2" Apr 22 14:25:26.906025 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:25:26.906001 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2\": container with ID starting with 9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2 not found: ID does not exist" containerID="9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2" Apr 22 14:25:26.906086 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.906033 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2"} err="failed to get container status 
\"9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2\": rpc error: code = NotFound desc = could not find container \"9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2\": container with ID starting with 9bee7920984172386f35b62f80cf6c56ae39f0593cfc686a308c01195c7856e2 not found: ID does not exist" Apr 22 14:25:26.926099 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.926077 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj"] Apr 22 14:25:26.931148 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:26.931124 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8rhj"] Apr 22 14:25:28.925773 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:28.925738 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a032912-f35e-4dcc-a2a7-fed9e85e297a" path="/var/lib/kubelet/pods/1a032912-f35e-4dcc-a2a7-fed9e85e297a/volumes" Apr 22 14:25:32.425421 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.425391 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-8pwxc"] Apr 22 14:25:32.425796 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.425710 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a032912-f35e-4dcc-a2a7-fed9e85e297a" containerName="discovery" Apr 22 14:25:32.425796 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.425722 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a032912-f35e-4dcc-a2a7-fed9e85e297a" containerName="discovery" Apr 22 14:25:32.425796 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.425785 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a032912-f35e-4dcc-a2a7-fed9e85e297a" containerName="discovery" Apr 22 14:25:32.430140 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.430122 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:25:32.432892 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.432870 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 14:25:32.433018 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.432973 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 14:25:32.433086 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.433063 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 14:25:32.434220 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.434202 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-4jrnf\"" Apr 22 14:25:32.443910 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.443888 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-8pwxc"] Apr 22 14:25:32.454682 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.454631 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-7x2qw"] Apr 22 14:25:32.458239 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.458222 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:25:32.461434 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.461417 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 14:25:32.461532 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.461438 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-2lkr9\"" Apr 22 14:25:32.481453 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.481423 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-7x2qw"] Apr 22 14:25:32.490736 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.490703 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-clk5r"] Apr 22 14:25:32.494083 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.494059 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:25:32.497381 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.497357 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wwmwc\"" Apr 22 14:25:32.497514 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.497357 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 14:25:32.506136 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.506112 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-clk5r"] Apr 22 14:25:32.594418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.594390 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2mck\" (UniqueName: \"kubernetes.io/projected/2d4194bf-7633-499f-b5e0-b4a3418f143e-kube-api-access-j2mck\") pod \"llmisvc-controller-manager-6f8c758999-7x2qw\" (UID: \"2d4194bf-7633-499f-b5e0-b4a3418f143e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:25:32.594597 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.594428 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c3349d53-fc9e-49c8-899f-ce00642fb46b-data\") pod \"seaweedfs-86cc847c5c-clk5r\" (UID: \"c3349d53-fc9e-49c8-899f-ce00642fb46b\") " pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:25:32.594597 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.594450 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-646zl\" (UniqueName: \"kubernetes.io/projected/31a36ebc-1737-4887-b77f-abe894dd74ae-kube-api-access-646zl\") pod \"kserve-controller-manager-66cf78b85b-8pwxc\" (UID: \"31a36ebc-1737-4887-b77f-abe894dd74ae\") " pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:25:32.594597 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.594500 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a36ebc-1737-4887-b77f-abe894dd74ae-cert\") pod \"kserve-controller-manager-66cf78b85b-8pwxc\" (UID: \"31a36ebc-1737-4887-b77f-abe894dd74ae\") " pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:25:32.594597 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.594544 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d4194bf-7633-499f-b5e0-b4a3418f143e-cert\") pod \"llmisvc-controller-manager-6f8c758999-7x2qw\" (UID: \"2d4194bf-7633-499f-b5e0-b4a3418f143e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:25:32.594808 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.594669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdgjx\" (UniqueName: \"kubernetes.io/projected/c3349d53-fc9e-49c8-899f-ce00642fb46b-kube-api-access-vdgjx\") pod \"seaweedfs-86cc847c5c-clk5r\" (UID: \"c3349d53-fc9e-49c8-899f-ce00642fb46b\") " pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:25:32.695885 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.695795 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2mck\" (UniqueName: \"kubernetes.io/projected/2d4194bf-7633-499f-b5e0-b4a3418f143e-kube-api-access-j2mck\") pod \"llmisvc-controller-manager-6f8c758999-7x2qw\" (UID: \"2d4194bf-7633-499f-b5e0-b4a3418f143e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:25:32.695885 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.695838 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c3349d53-fc9e-49c8-899f-ce00642fb46b-data\") pod 
\"seaweedfs-86cc847c5c-clk5r\" (UID: \"c3349d53-fc9e-49c8-899f-ce00642fb46b\") " pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:25:32.695885 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.695856 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-646zl\" (UniqueName: \"kubernetes.io/projected/31a36ebc-1737-4887-b77f-abe894dd74ae-kube-api-access-646zl\") pod \"kserve-controller-manager-66cf78b85b-8pwxc\" (UID: \"31a36ebc-1737-4887-b77f-abe894dd74ae\") " pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:25:32.696158 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.695986 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a36ebc-1737-4887-b77f-abe894dd74ae-cert\") pod \"kserve-controller-manager-66cf78b85b-8pwxc\" (UID: \"31a36ebc-1737-4887-b77f-abe894dd74ae\") " pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:25:32.696158 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.696056 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d4194bf-7633-499f-b5e0-b4a3418f143e-cert\") pod \"llmisvc-controller-manager-6f8c758999-7x2qw\" (UID: \"2d4194bf-7633-499f-b5e0-b4a3418f143e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:25:32.696261 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.696200 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdgjx\" (UniqueName: \"kubernetes.io/projected/c3349d53-fc9e-49c8-899f-ce00642fb46b-kube-api-access-vdgjx\") pod \"seaweedfs-86cc847c5c-clk5r\" (UID: \"c3349d53-fc9e-49c8-899f-ce00642fb46b\") " pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:25:32.696261 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.696214 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/c3349d53-fc9e-49c8-899f-ce00642fb46b-data\") pod \"seaweedfs-86cc847c5c-clk5r\" (UID: \"c3349d53-fc9e-49c8-899f-ce00642fb46b\") " pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:25:32.698573 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.698542 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d4194bf-7633-499f-b5e0-b4a3418f143e-cert\") pod \"llmisvc-controller-manager-6f8c758999-7x2qw\" (UID: \"2d4194bf-7633-499f-b5e0-b4a3418f143e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:25:32.698721 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.698547 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a36ebc-1737-4887-b77f-abe894dd74ae-cert\") pod \"kserve-controller-manager-66cf78b85b-8pwxc\" (UID: \"31a36ebc-1737-4887-b77f-abe894dd74ae\") " pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:25:32.709004 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.708979 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdgjx\" (UniqueName: \"kubernetes.io/projected/c3349d53-fc9e-49c8-899f-ce00642fb46b-kube-api-access-vdgjx\") pod \"seaweedfs-86cc847c5c-clk5r\" (UID: \"c3349d53-fc9e-49c8-899f-ce00642fb46b\") " pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:25:32.709104 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.709088 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-646zl\" (UniqueName: \"kubernetes.io/projected/31a36ebc-1737-4887-b77f-abe894dd74ae-kube-api-access-646zl\") pod \"kserve-controller-manager-66cf78b85b-8pwxc\" (UID: \"31a36ebc-1737-4887-b77f-abe894dd74ae\") " pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:25:32.714584 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.714563 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j2mck\" (UniqueName: \"kubernetes.io/projected/2d4194bf-7633-499f-b5e0-b4a3418f143e-kube-api-access-j2mck\") pod \"llmisvc-controller-manager-6f8c758999-7x2qw\" (UID: \"2d4194bf-7633-499f-b5e0-b4a3418f143e\") " pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:25:32.740008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.739977 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:25:32.769296 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.769266 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:25:32.804235 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.804198 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:25:32.883252 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.883193 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-8pwxc"] Apr 22 14:25:32.887400 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:25:32.887233 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31a36ebc_1737_4887_b77f_abe894dd74ae.slice/crio-63570ed5d45796a2ddfd4ae624737812919b2fbe395a5f52bc80c91da6e4b899 WatchSource:0}: Error finding container 63570ed5d45796a2ddfd4ae624737812919b2fbe395a5f52bc80c91da6e4b899: Status 404 returned error can't find the container with id 63570ed5d45796a2ddfd4ae624737812919b2fbe395a5f52bc80c91da6e4b899 Apr 22 14:25:32.912695 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.912672 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-7x2qw"] Apr 22 14:25:32.914578 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:25:32.914541 2562 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod2d4194bf_7633_499f_b5e0_b4a3418f143e.slice/crio-ccf778a53289e53e7c38849f2e7ec39f4c4bbf8db7b51d11d5647294a4253148 WatchSource:0}: Error finding container ccf778a53289e53e7c38849f2e7ec39f4c4bbf8db7b51d11d5647294a4253148: Status 404 returned error can't find the container with id ccf778a53289e53e7c38849f2e7ec39f4c4bbf8db7b51d11d5647294a4253148 Apr 22 14:25:32.932795 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.932763 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" event={"ID":"31a36ebc-1737-4887-b77f-abe894dd74ae","Type":"ContainerStarted","Data":"63570ed5d45796a2ddfd4ae624737812919b2fbe395a5f52bc80c91da6e4b899"} Apr 22 14:25:32.932795 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.932795 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" event={"ID":"2d4194bf-7633-499f-b5e0-b4a3418f143e","Type":"ContainerStarted","Data":"ccf778a53289e53e7c38849f2e7ec39f4c4bbf8db7b51d11d5647294a4253148"} Apr 22 14:25:32.954534 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:32.954498 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-clk5r"] Apr 22 14:25:32.956271 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:25:32.956245 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3349d53_fc9e_49c8_899f_ce00642fb46b.slice/crio-1857fc538dcc7b3cf64db2b1d398b364279b358e2fc1adced5d94048a4dc22b7 WatchSource:0}: Error finding container 1857fc538dcc7b3cf64db2b1d398b364279b358e2fc1adced5d94048a4dc22b7: Status 404 returned error can't find the container with id 1857fc538dcc7b3cf64db2b1d398b364279b358e2fc1adced5d94048a4dc22b7 Apr 22 14:25:33.937417 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:33.937352 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-clk5r" 
event={"ID":"c3349d53-fc9e-49c8-899f-ce00642fb46b","Type":"ContainerStarted","Data":"1857fc538dcc7b3cf64db2b1d398b364279b358e2fc1adced5d94048a4dc22b7"} Apr 22 14:25:36.950310 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:36.950269 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" event={"ID":"2d4194bf-7633-499f-b5e0-b4a3418f143e","Type":"ContainerStarted","Data":"0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309"} Apr 22 14:25:36.950715 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:36.950430 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:25:36.951702 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:36.951677 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-clk5r" event={"ID":"c3349d53-fc9e-49c8-899f-ce00642fb46b","Type":"ContainerStarted","Data":"4307b8325ce15ff872481f1535e37d87c848cf18780a12e9d397c7b59ba98db4"} Apr 22 14:25:36.951825 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:36.951779 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:25:36.953043 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:36.953021 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" event={"ID":"31a36ebc-1737-4887-b77f-abe894dd74ae","Type":"ContainerStarted","Data":"95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13"} Apr 22 14:25:36.953163 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:36.953149 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:25:36.996465 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:36.996376 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" podStartSLOduration=1.078664012 podStartE2EDuration="4.996359523s" podCreationTimestamp="2026-04-22 14:25:32 +0000 UTC" firstStartedPulling="2026-04-22 14:25:32.888446234 +0000 UTC m=+636.587806897" lastFinishedPulling="2026-04-22 14:25:36.806141747 +0000 UTC m=+640.505502408" observedRunningTime="2026-04-22 14:25:36.993394699 +0000 UTC m=+640.692755383" watchObservedRunningTime="2026-04-22 14:25:36.996359523 +0000 UTC m=+640.695720205" Apr 22 14:25:37.015909 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:37.015859 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-clk5r" podStartSLOduration=1.108388602 podStartE2EDuration="5.0158447s" podCreationTimestamp="2026-04-22 14:25:32 +0000 UTC" firstStartedPulling="2026-04-22 14:25:32.957414423 +0000 UTC m=+636.656775084" lastFinishedPulling="2026-04-22 14:25:36.864870515 +0000 UTC m=+640.564231182" observedRunningTime="2026-04-22 14:25:37.01501208 +0000 UTC m=+640.714372762" watchObservedRunningTime="2026-04-22 14:25:37.0158447 +0000 UTC m=+640.715205382" Apr 22 14:25:37.038757 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:37.038703 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" podStartSLOduration=1.148425152 podStartE2EDuration="5.038688089s" podCreationTimestamp="2026-04-22 14:25:32 +0000 UTC" firstStartedPulling="2026-04-22 14:25:32.915925164 +0000 UTC m=+636.615285824" lastFinishedPulling="2026-04-22 14:25:36.806188102 +0000 UTC m=+640.505548761" observedRunningTime="2026-04-22 14:25:37.037104769 +0000 UTC m=+640.736465472" watchObservedRunningTime="2026-04-22 14:25:37.038688089 +0000 UTC m=+640.738048770" Apr 22 14:25:42.958083 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:25:42.958051 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-clk5r" Apr 22 14:26:07.958723 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:07.958688 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:26:07.962501 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:07.962480 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:26:09.285608 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.285567 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-8pwxc"] Apr 22 14:26:09.286029 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.285816 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" podUID="31a36ebc-1737-4887-b77f-abe894dd74ae" containerName="manager" containerID="cri-o://95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13" gracePeriod=10 Apr 22 14:26:09.319570 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.319540 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-9x4sc"] Apr 22 14:26:09.323034 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.323018 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" Apr 22 14:26:09.330945 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.330920 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-9x4sc"] Apr 22 14:26:09.406222 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.406196 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1129478a-b9a3-4f2e-93d8-ec5a234a2054-cert\") pod \"kserve-controller-manager-66cf78b85b-9x4sc\" (UID: \"1129478a-b9a3-4f2e-93d8-ec5a234a2054\") " pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" Apr 22 14:26:09.406337 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.406239 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2fqd\" (UniqueName: \"kubernetes.io/projected/1129478a-b9a3-4f2e-93d8-ec5a234a2054-kube-api-access-d2fqd\") pod \"kserve-controller-manager-66cf78b85b-9x4sc\" (UID: \"1129478a-b9a3-4f2e-93d8-ec5a234a2054\") " pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" Apr 22 14:26:09.507186 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.507151 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1129478a-b9a3-4f2e-93d8-ec5a234a2054-cert\") pod \"kserve-controller-manager-66cf78b85b-9x4sc\" (UID: \"1129478a-b9a3-4f2e-93d8-ec5a234a2054\") " pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" Apr 22 14:26:09.507345 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.507198 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2fqd\" (UniqueName: \"kubernetes.io/projected/1129478a-b9a3-4f2e-93d8-ec5a234a2054-kube-api-access-d2fqd\") pod \"kserve-controller-manager-66cf78b85b-9x4sc\" (UID: \"1129478a-b9a3-4f2e-93d8-ec5a234a2054\") " 
pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" Apr 22 14:26:09.509738 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.509715 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1129478a-b9a3-4f2e-93d8-ec5a234a2054-cert\") pod \"kserve-controller-manager-66cf78b85b-9x4sc\" (UID: \"1129478a-b9a3-4f2e-93d8-ec5a234a2054\") " pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" Apr 22 14:26:09.516926 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.516899 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2fqd\" (UniqueName: \"kubernetes.io/projected/1129478a-b9a3-4f2e-93d8-ec5a234a2054-kube-api-access-d2fqd\") pod \"kserve-controller-manager-66cf78b85b-9x4sc\" (UID: \"1129478a-b9a3-4f2e-93d8-ec5a234a2054\") " pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" Apr 22 14:26:09.527009 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.526989 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" Apr 22 14:26:09.607850 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.607821 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a36ebc-1737-4887-b77f-abe894dd74ae-cert\") pod \"31a36ebc-1737-4887-b77f-abe894dd74ae\" (UID: \"31a36ebc-1737-4887-b77f-abe894dd74ae\") " Apr 22 14:26:09.608007 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.607885 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-646zl\" (UniqueName: \"kubernetes.io/projected/31a36ebc-1737-4887-b77f-abe894dd74ae-kube-api-access-646zl\") pod \"31a36ebc-1737-4887-b77f-abe894dd74ae\" (UID: \"31a36ebc-1737-4887-b77f-abe894dd74ae\") " Apr 22 14:26:09.609885 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.609856 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a36ebc-1737-4887-b77f-abe894dd74ae-cert" (OuterVolumeSpecName: "cert") pod "31a36ebc-1737-4887-b77f-abe894dd74ae" (UID: "31a36ebc-1737-4887-b77f-abe894dd74ae"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:26:09.609997 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.609907 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a36ebc-1737-4887-b77f-abe894dd74ae-kube-api-access-646zl" (OuterVolumeSpecName: "kube-api-access-646zl") pod "31a36ebc-1737-4887-b77f-abe894dd74ae" (UID: "31a36ebc-1737-4887-b77f-abe894dd74ae"). InnerVolumeSpecName "kube-api-access-646zl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:26:09.661778 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.661736 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" Apr 22 14:26:09.709049 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.709023 2562 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a36ebc-1737-4887-b77f-abe894dd74ae-cert\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:26:09.709163 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.709058 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-646zl\" (UniqueName: \"kubernetes.io/projected/31a36ebc-1737-4887-b77f-abe894dd74ae-kube-api-access-646zl\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:26:09.782917 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.782887 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-9x4sc"] Apr 22 14:26:09.783871 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:26:09.783836 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1129478a_b9a3_4f2e_93d8_ec5a234a2054.slice/crio-009593f2f532ab1db14993dfd6a41918554479a0a2317722e117132ac8b05aa3 WatchSource:0}: Error finding container 009593f2f532ab1db14993dfd6a41918554479a0a2317722e117132ac8b05aa3: Status 404 returned error can't find the container with id 009593f2f532ab1db14993dfd6a41918554479a0a2317722e117132ac8b05aa3 Apr 22 14:26:09.785146 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:09.785127 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:26:10.075189 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.075095 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" event={"ID":"1129478a-b9a3-4f2e-93d8-ec5a234a2054","Type":"ContainerStarted","Data":"009593f2f532ab1db14993dfd6a41918554479a0a2317722e117132ac8b05aa3"} Apr 22 14:26:10.076221 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.076199 2562 generic.go:358] "Generic (PLEG): container finished" podID="31a36ebc-1737-4887-b77f-abe894dd74ae" containerID="95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13" exitCode=0
Apr 22 14:26:10.076301 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.076252 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" event={"ID":"31a36ebc-1737-4887-b77f-abe894dd74ae","Type":"ContainerDied","Data":"95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13"}
Apr 22 14:26:10.076301 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.076259 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc"
Apr 22 14:26:10.076301 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.076269 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-8pwxc" event={"ID":"31a36ebc-1737-4887-b77f-abe894dd74ae","Type":"ContainerDied","Data":"63570ed5d45796a2ddfd4ae624737812919b2fbe395a5f52bc80c91da6e4b899"}
Apr 22 14:26:10.076301 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.076284 2562 scope.go:117] "RemoveContainer" containerID="95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13"
Apr 22 14:26:10.085253 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.085232 2562 scope.go:117] "RemoveContainer" containerID="95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13"
Apr 22 14:26:10.085501 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:26:10.085483 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13\": container with ID starting with 95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13 not found: ID does not exist" containerID="95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13"
Apr 22 14:26:10.085572 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.085507 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13"} err="failed to get container status \"95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13\": rpc error: code = NotFound desc = could not find container \"95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13\": container with ID starting with 95e92eef9f628c34594ae763c3fa3105920f205146ce8391e5d1c2d0fa8fef13 not found: ID does not exist"
Apr 22 14:26:10.099424 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.099393 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-8pwxc"]
Apr 22 14:26:10.103074 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.103047 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-8pwxc"]
Apr 22 14:26:10.925542 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:10.925511 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a36ebc-1737-4887-b77f-abe894dd74ae" path="/var/lib/kubelet/pods/31a36ebc-1737-4887-b77f-abe894dd74ae/volumes"
Apr 22 14:26:11.080757 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:11.080720 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" event={"ID":"1129478a-b9a3-4f2e-93d8-ec5a234a2054","Type":"ContainerStarted","Data":"be3379722c13c09b69ba73869790921ec482253a85aab5320938bbffea33e90d"}
Apr 22 14:26:11.080941 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:11.080888 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc"
Apr 22 14:26:11.099370 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:11.099318 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc" podStartSLOduration=1.7498198280000001 podStartE2EDuration="2.09930317s" podCreationTimestamp="2026-04-22 14:26:09 +0000 UTC" firstStartedPulling="2026-04-22 14:26:09.785262736 +0000 UTC m=+673.484623397" lastFinishedPulling="2026-04-22 14:26:10.134746075 +0000 UTC m=+673.834106739" observedRunningTime="2026-04-22 14:26:11.097668781 +0000 UTC m=+674.797029463" watchObservedRunningTime="2026-04-22 14:26:11.09930317 +0000 UTC m=+674.798663852"
Apr 22 14:26:42.090788 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:42.090756 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-9x4sc"
Apr 22 14:26:43.127680 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.127617 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-kz7zg"]
Apr 22 14:26:43.128125 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.128105 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31a36ebc-1737-4887-b77f-abe894dd74ae" containerName="manager"
Apr 22 14:26:43.128125 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.128124 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a36ebc-1737-4887-b77f-abe894dd74ae" containerName="manager"
Apr 22 14:26:43.128239 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.128213 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="31a36ebc-1737-4887-b77f-abe894dd74ae" containerName="manager"
Apr 22 14:26:43.131297 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.131277 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:43.134816 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.134796 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-7khlt\""
Apr 22 14:26:43.134922 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.134805 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 22 14:26:43.145948 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.145926 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-wm5rq"]
Apr 22 14:26:43.149154 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.149137 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:43.151758 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.151741 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-lq64m\""
Apr 22 14:26:43.152060 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.152046 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 22 14:26:43.161305 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.161285 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-kz7zg"]
Apr 22 14:26:43.165084 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.165062 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wm5rq"]
Apr 22 14:26:43.186839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.186810 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/060ac850-d845-4d76-ba56-b12612e9def6-cert\") pod \"odh-model-controller-696fc77849-wm5rq\" (UID: \"060ac850-d845-4d76-ba56-b12612e9def6\") " pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:43.186955 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.186846 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/404f0edd-0396-444b-9c49-e2eef58db60b-tls-certs\") pod \"model-serving-api-86f7b4b499-kz7zg\" (UID: \"404f0edd-0396-444b-9c49-e2eef58db60b\") " pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:43.187001 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.186950 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgx7l\" (UniqueName: \"kubernetes.io/projected/060ac850-d845-4d76-ba56-b12612e9def6-kube-api-access-tgx7l\") pod \"odh-model-controller-696fc77849-wm5rq\" (UID: \"060ac850-d845-4d76-ba56-b12612e9def6\") " pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:43.187058 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.187041 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlz7g\" (UniqueName: \"kubernetes.io/projected/404f0edd-0396-444b-9c49-e2eef58db60b-kube-api-access-xlz7g\") pod \"model-serving-api-86f7b4b499-kz7zg\" (UID: \"404f0edd-0396-444b-9c49-e2eef58db60b\") " pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:43.287789 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.287752 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/060ac850-d845-4d76-ba56-b12612e9def6-cert\") pod \"odh-model-controller-696fc77849-wm5rq\" (UID: \"060ac850-d845-4d76-ba56-b12612e9def6\") " pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:43.287789 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.287793 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/404f0edd-0396-444b-9c49-e2eef58db60b-tls-certs\") pod \"model-serving-api-86f7b4b499-kz7zg\" (UID: \"404f0edd-0396-444b-9c49-e2eef58db60b\") " pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:43.287987 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.287837 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgx7l\" (UniqueName: \"kubernetes.io/projected/060ac850-d845-4d76-ba56-b12612e9def6-kube-api-access-tgx7l\") pod \"odh-model-controller-696fc77849-wm5rq\" (UID: \"060ac850-d845-4d76-ba56-b12612e9def6\") " pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:43.287987 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.287879 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlz7g\" (UniqueName: \"kubernetes.io/projected/404f0edd-0396-444b-9c49-e2eef58db60b-kube-api-access-xlz7g\") pod \"model-serving-api-86f7b4b499-kz7zg\" (UID: \"404f0edd-0396-444b-9c49-e2eef58db60b\") " pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:43.287987 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:26:43.287907 2562 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 22 14:26:43.287987 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:26:43.287980 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060ac850-d845-4d76-ba56-b12612e9def6-cert podName:060ac850-d845-4d76-ba56-b12612e9def6 nodeName:}" failed. No retries permitted until 2026-04-22 14:26:43.787961948 +0000 UTC m=+707.487322626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/060ac850-d845-4d76-ba56-b12612e9def6-cert") pod "odh-model-controller-696fc77849-wm5rq" (UID: "060ac850-d845-4d76-ba56-b12612e9def6") : secret "odh-model-controller-webhook-cert" not found
Apr 22 14:26:43.290374 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.290347 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/404f0edd-0396-444b-9c49-e2eef58db60b-tls-certs\") pod \"model-serving-api-86f7b4b499-kz7zg\" (UID: \"404f0edd-0396-444b-9c49-e2eef58db60b\") " pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:43.299351 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.299325 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgx7l\" (UniqueName: \"kubernetes.io/projected/060ac850-d845-4d76-ba56-b12612e9def6-kube-api-access-tgx7l\") pod \"odh-model-controller-696fc77849-wm5rq\" (UID: \"060ac850-d845-4d76-ba56-b12612e9def6\") " pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:43.299500 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.299485 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlz7g\" (UniqueName: \"kubernetes.io/projected/404f0edd-0396-444b-9c49-e2eef58db60b-kube-api-access-xlz7g\") pod \"model-serving-api-86f7b4b499-kz7zg\" (UID: \"404f0edd-0396-444b-9c49-e2eef58db60b\") " pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:43.442166 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.442079 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:43.562978 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.562945 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-kz7zg"]
Apr 22 14:26:43.564246 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:26:43.564216 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod404f0edd_0396_444b_9c49_e2eef58db60b.slice/crio-29806538b72dad12da830097801eb474f6ea46f01b296404f534b998f487f44d WatchSource:0}: Error finding container 29806538b72dad12da830097801eb474f6ea46f01b296404f534b998f487f44d: Status 404 returned error can't find the container with id 29806538b72dad12da830097801eb474f6ea46f01b296404f534b998f487f44d
Apr 22 14:26:43.792577 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.792488 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/060ac850-d845-4d76-ba56-b12612e9def6-cert\") pod \"odh-model-controller-696fc77849-wm5rq\" (UID: \"060ac850-d845-4d76-ba56-b12612e9def6\") " pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:43.794861 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:43.794830 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/060ac850-d845-4d76-ba56-b12612e9def6-cert\") pod \"odh-model-controller-696fc77849-wm5rq\" (UID: \"060ac850-d845-4d76-ba56-b12612e9def6\") " pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:44.059398 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:44.059309 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:44.198171 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:44.198117 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-kz7zg" event={"ID":"404f0edd-0396-444b-9c49-e2eef58db60b","Type":"ContainerStarted","Data":"29806538b72dad12da830097801eb474f6ea46f01b296404f534b998f487f44d"}
Apr 22 14:26:44.205870 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:44.205844 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wm5rq"]
Apr 22 14:26:44.206795 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:26:44.206765 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod060ac850_d845_4d76_ba56_b12612e9def6.slice/crio-f7253fb5f7ecb9108a97401ab3f4b150187c090c95c27dfea82034e9bd30af6c WatchSource:0}: Error finding container f7253fb5f7ecb9108a97401ab3f4b150187c090c95c27dfea82034e9bd30af6c: Status 404 returned error can't find the container with id f7253fb5f7ecb9108a97401ab3f4b150187c090c95c27dfea82034e9bd30af6c
Apr 22 14:26:45.203279 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:45.203238 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wm5rq" event={"ID":"060ac850-d845-4d76-ba56-b12612e9def6","Type":"ContainerStarted","Data":"f7253fb5f7ecb9108a97401ab3f4b150187c090c95c27dfea82034e9bd30af6c"}
Apr 22 14:26:45.204870 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:45.204837 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-kz7zg" event={"ID":"404f0edd-0396-444b-9c49-e2eef58db60b","Type":"ContainerStarted","Data":"05fabf67ff36e463c0fec8572fbc14ba8351b9d84da9287cf84fe045cfddbb51"}
Apr 22 14:26:45.204991 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:45.204890 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:45.238926 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:45.238846 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-kz7zg" podStartSLOduration=1.013012324 podStartE2EDuration="2.238790924s" podCreationTimestamp="2026-04-22 14:26:43 +0000 UTC" firstStartedPulling="2026-04-22 14:26:43.566402625 +0000 UTC m=+707.265763285" lastFinishedPulling="2026-04-22 14:26:44.792181218 +0000 UTC m=+708.491541885" observedRunningTime="2026-04-22 14:26:45.238199886 +0000 UTC m=+708.937560569" watchObservedRunningTime="2026-04-22 14:26:45.238790924 +0000 UTC m=+708.938151608"
Apr 22 14:26:47.214798 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:47.214759 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wm5rq" event={"ID":"060ac850-d845-4d76-ba56-b12612e9def6","Type":"ContainerStarted","Data":"081731f35427285af4e36cdd62efc0ee1842b4f0665fd89feef36c7dff4cddb2"}
Apr 22 14:26:47.215233 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:47.214896 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:47.234394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:47.234344 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-wm5rq" podStartSLOduration=1.322565056 podStartE2EDuration="4.234328337s" podCreationTimestamp="2026-04-22 14:26:43 +0000 UTC" firstStartedPulling="2026-04-22 14:26:44.209798503 +0000 UTC m=+707.909159166" lastFinishedPulling="2026-04-22 14:26:47.121561774 +0000 UTC m=+710.820922447" observedRunningTime="2026-04-22 14:26:47.232509684 +0000 UTC m=+710.931870404" watchObservedRunningTime="2026-04-22 14:26:47.234328337 +0000 UTC m=+710.933689020"
Apr 22 14:26:56.213848 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:56.213811 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-kz7zg"
Apr 22 14:26:58.221527 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:58.221494 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-wm5rq"
Apr 22 14:26:59.033970 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:59.033935 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-x8dpq"]
Apr 22 14:26:59.038239 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:59.038221 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-x8dpq"
Apr 22 14:26:59.049552 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:59.048199 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-x8dpq"]
Apr 22 14:26:59.123465 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:59.123435 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhgpc\" (UniqueName: \"kubernetes.io/projected/f69d9140-fa39-4ab0-a5e7-0b6fb00a4787-kube-api-access-mhgpc\") pod \"s3-init-x8dpq\" (UID: \"f69d9140-fa39-4ab0-a5e7-0b6fb00a4787\") " pod="kserve/s3-init-x8dpq"
Apr 22 14:26:59.224725 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:59.224686 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhgpc\" (UniqueName: \"kubernetes.io/projected/f69d9140-fa39-4ab0-a5e7-0b6fb00a4787-kube-api-access-mhgpc\") pod \"s3-init-x8dpq\" (UID: \"f69d9140-fa39-4ab0-a5e7-0b6fb00a4787\") " pod="kserve/s3-init-x8dpq"
Apr 22 14:26:59.233894 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:59.233872 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhgpc\" (UniqueName: \"kubernetes.io/projected/f69d9140-fa39-4ab0-a5e7-0b6fb00a4787-kube-api-access-mhgpc\") pod \"s3-init-x8dpq\" (UID: \"f69d9140-fa39-4ab0-a5e7-0b6fb00a4787\") " pod="kserve/s3-init-x8dpq"
Apr 22 14:26:59.354121 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:59.354083 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-x8dpq"
Apr 22 14:26:59.488741 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:26:59.488714 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-x8dpq"]
Apr 22 14:26:59.489489 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:26:59.489465 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf69d9140_fa39_4ab0_a5e7_0b6fb00a4787.slice/crio-3b36c62c1600b2fd8180c038b810f59fe21ce3a27ccea3c0aae2d2faf25c678e WatchSource:0}: Error finding container 3b36c62c1600b2fd8180c038b810f59fe21ce3a27ccea3c0aae2d2faf25c678e: Status 404 returned error can't find the container with id 3b36c62c1600b2fd8180c038b810f59fe21ce3a27ccea3c0aae2d2faf25c678e
Apr 22 14:27:00.267079 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:00.267026 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x8dpq" event={"ID":"f69d9140-fa39-4ab0-a5e7-0b6fb00a4787","Type":"ContainerStarted","Data":"3b36c62c1600b2fd8180c038b810f59fe21ce3a27ccea3c0aae2d2faf25c678e"}
Apr 22 14:27:04.283738 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:04.283636 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x8dpq" event={"ID":"f69d9140-fa39-4ab0-a5e7-0b6fb00a4787","Type":"ContainerStarted","Data":"c8004f15bee2e184666c767067a053ed60c2499367e28f05a960877275356c10"}
Apr 22 14:27:04.304989 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:04.304939 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-x8dpq" podStartSLOduration=0.817911347 podStartE2EDuration="5.304923661s" podCreationTimestamp="2026-04-22 14:26:59 +0000 UTC" firstStartedPulling="2026-04-22 14:26:59.491373696 +0000 UTC m=+723.190734371" lastFinishedPulling="2026-04-22 14:27:03.978386024 +0000 UTC m=+727.677746685" observedRunningTime="2026-04-22 14:27:04.303147811 +0000 UTC m=+728.002508504" watchObservedRunningTime="2026-04-22 14:27:04.304923661 +0000 UTC m=+728.004284342"
Apr 22 14:27:07.296346 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:07.296250 2562 generic.go:358] "Generic (PLEG): container finished" podID="f69d9140-fa39-4ab0-a5e7-0b6fb00a4787" containerID="c8004f15bee2e184666c767067a053ed60c2499367e28f05a960877275356c10" exitCode=0
Apr 22 14:27:07.296346 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:07.296318 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x8dpq" event={"ID":"f69d9140-fa39-4ab0-a5e7-0b6fb00a4787","Type":"ContainerDied","Data":"c8004f15bee2e184666c767067a053ed60c2499367e28f05a960877275356c10"}
Apr 22 14:27:08.421708 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:08.421683 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-x8dpq"
Apr 22 14:27:08.515839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:08.515804 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhgpc\" (UniqueName: \"kubernetes.io/projected/f69d9140-fa39-4ab0-a5e7-0b6fb00a4787-kube-api-access-mhgpc\") pod \"f69d9140-fa39-4ab0-a5e7-0b6fb00a4787\" (UID: \"f69d9140-fa39-4ab0-a5e7-0b6fb00a4787\") "
Apr 22 14:27:08.517827 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:08.517802 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f69d9140-fa39-4ab0-a5e7-0b6fb00a4787-kube-api-access-mhgpc" (OuterVolumeSpecName: "kube-api-access-mhgpc") pod "f69d9140-fa39-4ab0-a5e7-0b6fb00a4787" (UID: "f69d9140-fa39-4ab0-a5e7-0b6fb00a4787"). InnerVolumeSpecName "kube-api-access-mhgpc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:27:08.616662 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:08.616621 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mhgpc\" (UniqueName: \"kubernetes.io/projected/f69d9140-fa39-4ab0-a5e7-0b6fb00a4787-kube-api-access-mhgpc\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:27:09.304560 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:09.304473 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-x8dpq"
Apr 22 14:27:09.304560 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:09.304510 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x8dpq" event={"ID":"f69d9140-fa39-4ab0-a5e7-0b6fb00a4787","Type":"ContainerDied","Data":"3b36c62c1600b2fd8180c038b810f59fe21ce3a27ccea3c0aae2d2faf25c678e"}
Apr 22 14:27:09.304560 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:09.304539 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b36c62c1600b2fd8180c038b810f59fe21ce3a27ccea3c0aae2d2faf25c678e"
Apr 22 14:27:19.861895 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.861859 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"]
Apr 22 14:27:19.862349 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.862182 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f69d9140-fa39-4ab0-a5e7-0b6fb00a4787" containerName="s3-init"
Apr 22 14:27:19.862349 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.862192 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69d9140-fa39-4ab0-a5e7-0b6fb00a4787" containerName="s3-init"
Apr 22 14:27:19.862349 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.862257 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="f69d9140-fa39-4ab0-a5e7-0b6fb00a4787" containerName="s3-init"
Apr 22 14:27:19.865286 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.865258 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:19.868755 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.868726 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 14:27:19.869337 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.868986 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 14:27:19.869337 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.869186 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 22 14:27:19.869788 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.869602 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-gnh4w\""
Apr 22 14:27:19.881432 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.881401 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"]
Apr 22 14:27:19.916256 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.916229 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:19.916401 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.916275 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:19.916401 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.916325 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3083646c-cb86-490f-b489-adfa24221e89-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:19.916401 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.916343 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmw5j\" (UniqueName: \"kubernetes.io/projected/3083646c-cb86-490f-b489-adfa24221e89-kube-api-access-zmw5j\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:19.916401 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.916377 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:19.916534 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.916489 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3083646c-cb86-490f-b489-adfa24221e89-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:19.916567 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.916540 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:19.916601 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.916565 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:19.916601 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:19.916590 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3083646c-cb86-490f-b489-adfa24221e89-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.017836 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.017792 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.017857 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.017912 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3083646c-cb86-490f-b489-adfa24221e89-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.017937 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmw5j\" (UniqueName: \"kubernetes.io/projected/3083646c-cb86-490f-b489-adfa24221e89-kube-api-access-zmw5j\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.017995 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.018030 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3083646c-cb86-490f-b489-adfa24221e89-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.018062 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.018091 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.018122 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3083646c-cb86-490f-b489-adfa24221e89-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018533 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.018288 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018533 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.018302 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018533 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.018423 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.018704 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.018636 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.019171 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.019132 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3083646c-cb86-490f-b489-adfa24221e89-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.020690 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.020636 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3083646c-cb86-490f-b489-adfa24221e89-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.020923 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.020903 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3083646c-cb86-490f-b489-adfa24221e89-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.037335 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.037309 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmw5j\" (UniqueName: \"kubernetes.io/projected/3083646c-cb86-490f-b489-adfa24221e89-kube-api-access-zmw5j\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID: \"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"
Apr 22 14:27:20.037445 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.037379 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3083646c-cb86-490f-b489-adfa24221e89-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-k9fch\" (UID:
\"3083646c-cb86-490f-b489-adfa24221e89\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch" Apr 22 14:27:20.181513 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.181411 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch" Apr 22 14:27:20.397107 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.396885 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch"] Apr 22 14:27:20.398932 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:27:20.398899 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3083646c_cb86_490f_b489_adfa24221e89.slice/crio-4874222ec080ba72c120a08cd0e1d68bb2f35764773cefa30b026cb332e8fa50 WatchSource:0}: Error finding container 4874222ec080ba72c120a08cd0e1d68bb2f35764773cefa30b026cb332e8fa50: Status 404 returned error can't find the container with id 4874222ec080ba72c120a08cd0e1d68bb2f35764773cefa30b026cb332e8fa50 Apr 22 14:27:20.401083 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.401054 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:27:20.401172 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.401128 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:27:20.401172 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:20.401159 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:27:21.353967 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:21.353931 
2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch" event={"ID":"3083646c-cb86-490f-b489-adfa24221e89","Type":"ContainerStarted","Data":"2e07a322b51a1323fcaaeb0b1fbd79317132cf6769240fe161bdfaf1fc66b9ff"} Apr 22 14:27:21.354340 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:21.353975 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch" event={"ID":"3083646c-cb86-490f-b489-adfa24221e89","Type":"ContainerStarted","Data":"4874222ec080ba72c120a08cd0e1d68bb2f35764773cefa30b026cb332e8fa50"} Apr 22 14:27:21.381263 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:21.381207 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch" podStartSLOduration=2.381192358 podStartE2EDuration="2.381192358s" podCreationTimestamp="2026-04-22 14:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:27:21.37925092 +0000 UTC m=+745.078611625" watchObservedRunningTime="2026-04-22 14:27:21.381192358 +0000 UTC m=+745.080553040" Apr 22 14:27:22.182566 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:22.182508 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch" Apr 22 14:27:22.188391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:22.188366 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch" Apr 22 14:27:22.357369 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:22.357331 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch" Apr 22 14:27:22.358313 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:22.358293 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-k9fch" Apr 22 14:27:35.356548 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.356511 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq"] Apr 22 14:27:35.393130 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.393098 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq"] Apr 22 14:27:35.393277 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.393238 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.397484 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.397457 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q979b\"" Apr 22 14:27:35.397623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.397457 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 14:27:35.455177 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.455142 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.455347 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.455245 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-home\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.455347 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.455291 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lllbp\" (UniqueName: \"kubernetes.io/projected/1ba9e53b-a553-48a4-85ae-febedea59c37-kube-api-access-lllbp\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.455347 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.455327 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-dshm\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.455519 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.455408 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-model-cache\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.455519 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.455440 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1ba9e53b-a553-48a4-85ae-febedea59c37-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.556671 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.556617 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-home\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.556671 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.556676 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lllbp\" (UniqueName: \"kubernetes.io/projected/1ba9e53b-a553-48a4-85ae-febedea59c37-kube-api-access-lllbp\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.556908 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.556703 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-dshm\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.556908 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.556731 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-model-cache\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: 
\"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.556908 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.556752 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba9e53b-a553-48a4-85ae-febedea59c37-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.556908 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.556786 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.557132 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.557107 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-home\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.557186 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.557154 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-model-cache\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.557223 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:27:35.557178 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.558990 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.558962 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-dshm\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.559158 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.559140 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba9e53b-a553-48a4-85ae-febedea59c37-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.565199 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.565176 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lllbp\" (UniqueName: \"kubernetes.io/projected/1ba9e53b-a553-48a4-85ae-febedea59c37-kube-api-access-lllbp\") pod \"scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.705993 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.705897 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:35.847506 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:35.847483 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq"] Apr 22 14:27:35.849774 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:27:35.849740 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ba9e53b_a553_48a4_85ae_febedea59c37.slice/crio-677fb1f1324af53cf952d732882094ca30a0375bb7f9d811f649294677f4bb7e WatchSource:0}: Error finding container 677fb1f1324af53cf952d732882094ca30a0375bb7f9d811f649294677f4bb7e: Status 404 returned error can't find the container with id 677fb1f1324af53cf952d732882094ca30a0375bb7f9d811f649294677f4bb7e Apr 22 14:27:36.409775 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:36.409733 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" event={"ID":"1ba9e53b-a553-48a4-85ae-febedea59c37","Type":"ContainerStarted","Data":"677fb1f1324af53cf952d732882094ca30a0375bb7f9d811f649294677f4bb7e"} Apr 22 14:27:40.435163 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:40.435118 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" event={"ID":"1ba9e53b-a553-48a4-85ae-febedea59c37","Type":"ContainerStarted","Data":"5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111"} Apr 22 14:27:44.452911 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:44.452869 2562 generic.go:358] "Generic (PLEG): container finished" podID="1ba9e53b-a553-48a4-85ae-febedea59c37" containerID="5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111" exitCode=0 Apr 22 14:27:44.453300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:44.452944 2562 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" event={"ID":"1ba9e53b-a553-48a4-85ae-febedea59c37","Type":"ContainerDied","Data":"5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111"} Apr 22 14:27:46.461618 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.461580 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" event={"ID":"1ba9e53b-a553-48a4-85ae-febedea59c37","Type":"ContainerStarted","Data":"c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b"} Apr 22 14:27:46.483278 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.483224 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" podStartSLOduration=1.695418042 podStartE2EDuration="11.483209165s" podCreationTimestamp="2026-04-22 14:27:35 +0000 UTC" firstStartedPulling="2026-04-22 14:27:35.851641514 +0000 UTC m=+759.551002190" lastFinishedPulling="2026-04-22 14:27:45.639432651 +0000 UTC m=+769.338793313" observedRunningTime="2026-04-22 14:27:46.480678788 +0000 UTC m=+770.180039470" watchObservedRunningTime="2026-04-22 14:27:46.483209165 +0000 UTC m=+770.182569847" Apr 22 14:27:46.630291 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.630245 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c"] Apr 22 14:27:46.635008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.634984 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.637856 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.637832 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 22 14:27:46.657811 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.657782 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c"] Apr 22 14:27:46.790316 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.790235 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.790316 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.790275 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.790316 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.790293 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/559e50f3-5492-44f3-a546-6f5fee3a4e7b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.790606 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.790454 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5zx8\" (UniqueName: \"kubernetes.io/projected/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kube-api-access-p5zx8\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.790606 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.790503 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.790606 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.790586 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.891611 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.891564 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5zx8\" (UniqueName: \"kubernetes.io/projected/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kube-api-access-p5zx8\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" 
(UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.891791 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.891627 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.891791 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.891707 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.891791 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.891749 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.891791 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.891768 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.891791 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.891785 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/559e50f3-5492-44f3-a546-6f5fee3a4e7b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.892130 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.892104 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.892212 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.892200 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.892261 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.892203 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 
14:27:46.894095 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.894073 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.894266 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.894247 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/559e50f3-5492-44f3-a546-6f5fee3a4e7b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.901275 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.901253 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5zx8\" (UniqueName: \"kubernetes.io/projected/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kube-api-access-p5zx8\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:46.952259 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:46.952228 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:47.090562 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:47.090484 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c"] Apr 22 14:27:47.090753 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:27:47.090728 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod559e50f3_5492_44f3_a546_6f5fee3a4e7b.slice/crio-ddce04dd9376d039fc62bebc6a7c647ac0b507590759768270bf3bab8f5b66fe WatchSource:0}: Error finding container ddce04dd9376d039fc62bebc6a7c647ac0b507590759768270bf3bab8f5b66fe: Status 404 returned error can't find the container with id ddce04dd9376d039fc62bebc6a7c647ac0b507590759768270bf3bab8f5b66fe Apr 22 14:27:47.467543 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:47.467434 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" event={"ID":"559e50f3-5492-44f3-a546-6f5fee3a4e7b","Type":"ContainerStarted","Data":"b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa"} Apr 22 14:27:47.467543 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:47.467481 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" event={"ID":"559e50f3-5492-44f3-a546-6f5fee3a4e7b","Type":"ContainerStarted","Data":"ddce04dd9376d039fc62bebc6a7c647ac0b507590759768270bf3bab8f5b66fe"} Apr 22 14:27:52.489993 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:52.489959 2562 generic.go:358] "Generic (PLEG): container finished" podID="559e50f3-5492-44f3-a546-6f5fee3a4e7b" containerID="b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa" exitCode=0 Apr 22 14:27:52.490388 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:27:52.490036 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" event={"ID":"559e50f3-5492-44f3-a546-6f5fee3a4e7b","Type":"ContainerDied","Data":"b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa"} Apr 22 14:27:53.496160 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:53.496125 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" event={"ID":"559e50f3-5492-44f3-a546-6f5fee3a4e7b","Type":"ContainerStarted","Data":"641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc"} Apr 22 14:27:53.518667 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:53.518604 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" podStartSLOduration=7.518589601 podStartE2EDuration="7.518589601s" podCreationTimestamp="2026-04-22 14:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:27:53.515940423 +0000 UTC m=+777.215301110" watchObservedRunningTime="2026-04-22 14:27:53.518589601 +0000 UTC m=+777.217950313" Apr 22 14:27:55.706826 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:55.706789 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:55.706826 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:55.706830 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:55.719481 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:55.719450 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:56.519537 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:56.519507 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:27:56.953381 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:56.953345 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:56.953784 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:56.953484 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:56.966009 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:56.965983 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:27:57.523386 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:27:57.523359 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:28:29.001518 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.001303 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq"] Apr 22 14:28:29.002077 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.001865 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" podUID="1ba9e53b-a553-48a4-85ae-febedea59c37" containerName="main" containerID="cri-o://c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b" gracePeriod=30 Apr 22 14:28:29.257449 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.257377 2562 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:28:29.286062 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.286021 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-dshm\") pod \"1ba9e53b-a553-48a4-85ae-febedea59c37\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " Apr 22 14:28:29.286062 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.286056 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-home\") pod \"1ba9e53b-a553-48a4-85ae-febedea59c37\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " Apr 22 14:28:29.286294 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.286107 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-kserve-provision-location\") pod \"1ba9e53b-a553-48a4-85ae-febedea59c37\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " Apr 22 14:28:29.286294 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.286130 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba9e53b-a553-48a4-85ae-febedea59c37-tls-certs\") pod \"1ba9e53b-a553-48a4-85ae-febedea59c37\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " Apr 22 14:28:29.286294 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.286163 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lllbp\" (UniqueName: \"kubernetes.io/projected/1ba9e53b-a553-48a4-85ae-febedea59c37-kube-api-access-lllbp\") pod \"1ba9e53b-a553-48a4-85ae-febedea59c37\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " 
Apr 22 14:28:29.286294 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.286196 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-model-cache\") pod \"1ba9e53b-a553-48a4-85ae-febedea59c37\" (UID: \"1ba9e53b-a553-48a4-85ae-febedea59c37\") " Apr 22 14:28:29.286679 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.286614 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-home" (OuterVolumeSpecName: "home") pod "1ba9e53b-a553-48a4-85ae-febedea59c37" (UID: "1ba9e53b-a553-48a4-85ae-febedea59c37"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:29.286679 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.286666 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-model-cache" (OuterVolumeSpecName: "model-cache") pod "1ba9e53b-a553-48a4-85ae-febedea59c37" (UID: "1ba9e53b-a553-48a4-85ae-febedea59c37"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:29.288854 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.288827 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-dshm" (OuterVolumeSpecName: "dshm") pod "1ba9e53b-a553-48a4-85ae-febedea59c37" (UID: "1ba9e53b-a553-48a4-85ae-febedea59c37"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:29.288854 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.288836 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba9e53b-a553-48a4-85ae-febedea59c37-kube-api-access-lllbp" (OuterVolumeSpecName: "kube-api-access-lllbp") pod "1ba9e53b-a553-48a4-85ae-febedea59c37" (UID: "1ba9e53b-a553-48a4-85ae-febedea59c37"). InnerVolumeSpecName "kube-api-access-lllbp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:28:29.288979 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.288912 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba9e53b-a553-48a4-85ae-febedea59c37-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1ba9e53b-a553-48a4-85ae-febedea59c37" (UID: "1ba9e53b-a553-48a4-85ae-febedea59c37"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:28:29.353115 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.353064 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1ba9e53b-a553-48a4-85ae-febedea59c37" (UID: "1ba9e53b-a553-48a4-85ae-febedea59c37"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:29.387196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.387156 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:29.387196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.387189 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba9e53b-a553-48a4-85ae-febedea59c37-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:29.387196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.387201 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lllbp\" (UniqueName: \"kubernetes.io/projected/1ba9e53b-a553-48a4-85ae-febedea59c37-kube-api-access-lllbp\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:29.387429 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.387209 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:29.387429 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.387220 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:29.387429 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.387228 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ba9e53b-a553-48a4-85ae-febedea59c37-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:29.634872 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.634831 2562 generic.go:358] 
"Generic (PLEG): container finished" podID="1ba9e53b-a553-48a4-85ae-febedea59c37" containerID="c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b" exitCode=0 Apr 22 14:28:29.635044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.634920 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" Apr 22 14:28:29.635044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.634916 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" event={"ID":"1ba9e53b-a553-48a4-85ae-febedea59c37","Type":"ContainerDied","Data":"c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b"} Apr 22 14:28:29.635044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.634965 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq" event={"ID":"1ba9e53b-a553-48a4-85ae-febedea59c37","Type":"ContainerDied","Data":"677fb1f1324af53cf952d732882094ca30a0375bb7f9d811f649294677f4bb7e"} Apr 22 14:28:29.635044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.634981 2562 scope.go:117] "RemoveContainer" containerID="c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b" Apr 22 14:28:29.644254 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.644237 2562 scope.go:117] "RemoveContainer" containerID="5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111" Apr 22 14:28:29.658399 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.658374 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq"] Apr 22 14:28:29.662936 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.662912 2562 scope.go:117] "RemoveContainer" containerID="c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b" Apr 22 14:28:29.663284 ip-10-0-133-31 
kubenswrapper[2562]: E0422 14:28:29.663257 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b\": container with ID starting with c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b not found: ID does not exist" containerID="c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b" Apr 22 14:28:29.663365 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.663288 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b"} err="failed to get container status \"c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b\": rpc error: code = NotFound desc = could not find container \"c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b\": container with ID starting with c4721a1eb1a1a5a4f4c70fc892e9239120481d6fd3d89eecbb8f0c99008e9a3b not found: ID does not exist" Apr 22 14:28:29.663365 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.663313 2562 scope.go:117] "RemoveContainer" containerID="5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111" Apr 22 14:28:29.663622 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:28:29.663599 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111\": container with ID starting with 5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111 not found: ID does not exist" containerID="5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111" Apr 22 14:28:29.663698 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.663641 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111"} err="failed to 
get container status \"5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111\": rpc error: code = NotFound desc = could not find container \"5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111\": container with ID starting with 5d1c0a3120877a4a4b4b40598604a12095124159354860da189c431c827b6111 not found: ID does not exist" Apr 22 14:28:29.664484 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:29.664466 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-58f94bc4f9-pfsbq"] Apr 22 14:28:30.926330 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:30.926294 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ba9e53b-a553-48a4-85ae-febedea59c37" path="/var/lib/kubelet/pods/1ba9e53b-a553-48a4-85ae-febedea59c37/volumes" Apr 22 14:28:52.133114 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.133061 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk"] Apr 22 14:28:52.133567 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.133551 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ba9e53b-a553-48a4-85ae-febedea59c37" containerName="storage-initializer" Apr 22 14:28:52.133567 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.133569 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba9e53b-a553-48a4-85ae-febedea59c37" containerName="storage-initializer" Apr 22 14:28:52.133641 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.133578 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ba9e53b-a553-48a4-85ae-febedea59c37" containerName="main" Apr 22 14:28:52.133641 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.133585 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba9e53b-a553-48a4-85ae-febedea59c37" containerName="main" Apr 22 14:28:52.133725 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:28:52.133641 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ba9e53b-a553-48a4-85ae-febedea59c37" containerName="main" Apr 22 14:28:52.136946 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.136930 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.139931 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.139907 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-cjn9l\"" Apr 22 14:28:52.140066 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.139950 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 14:28:52.149566 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.149539 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk"] Apr 22 14:28:52.175392 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.175360 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.175530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.175406 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: 
\"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.175530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.175449 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f637a04-9609-4e7f-8dea-fd300963e9fa-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.175620 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.175552 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.175620 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.175605 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2vs\" (UniqueName: \"kubernetes.io/projected/3f637a04-9609-4e7f-8dea-fd300963e9fa-kube-api-access-xn2vs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.175740 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.175672 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-tmp\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.276512 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.276474 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f637a04-9609-4e7f-8dea-fd300963e9fa-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.276691 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.276535 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.276691 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.276564 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2vs\" (UniqueName: \"kubernetes.io/projected/3f637a04-9609-4e7f-8dea-fd300963e9fa-kube-api-access-xn2vs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.276691 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.276593 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-tmp\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.276691 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.276628 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.276691 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.276688 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.277138 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.277106 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.277258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.277121 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-tmp\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.277258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.277201 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.277258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.277201 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.279093 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.279072 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f637a04-9609-4e7f-8dea-fd300963e9fa-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.286113 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.286087 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn2vs\" (UniqueName: \"kubernetes.io/projected/3f637a04-9609-4e7f-8dea-fd300963e9fa-kube-api-access-xn2vs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk\" (UID: 
\"3f637a04-9609-4e7f-8dea-fd300963e9fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.447136 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.447048 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:28:52.580125 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.580098 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk"] Apr 22 14:28:52.582386 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:28:52.582355 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f637a04_9609_4e7f_8dea_fd300963e9fa.slice/crio-21d914a7f8bf214f7240e52b68277b2fe7749c3dbccc610345e28f635532b66f WatchSource:0}: Error finding container 21d914a7f8bf214f7240e52b68277b2fe7749c3dbccc610345e28f635532b66f: Status 404 returned error can't find the container with id 21d914a7f8bf214f7240e52b68277b2fe7749c3dbccc610345e28f635532b66f Apr 22 14:28:52.723566 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.723474 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" event={"ID":"3f637a04-9609-4e7f-8dea-fd300963e9fa","Type":"ContainerStarted","Data":"6b5f5842e4068982449339aee391bf8ce4b83040bb96a7ac33299b5688596609"} Apr 22 14:28:52.723566 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:52.723516 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" event={"ID":"3f637a04-9609-4e7f-8dea-fd300963e9fa","Type":"ContainerStarted","Data":"21d914a7f8bf214f7240e52b68277b2fe7749c3dbccc610345e28f635532b66f"} Apr 22 14:28:53.654469 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.654434 
2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c"] Apr 22 14:28:53.654903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.654738 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" podUID="559e50f3-5492-44f3-a546-6f5fee3a4e7b" containerName="main" containerID="cri-o://641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc" gracePeriod=30 Apr 22 14:28:53.729506 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.729420 2562 generic.go:358] "Generic (PLEG): container finished" podID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerID="6b5f5842e4068982449339aee391bf8ce4b83040bb96a7ac33299b5688596609" exitCode=0 Apr 22 14:28:53.729637 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.729501 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" event={"ID":"3f637a04-9609-4e7f-8dea-fd300963e9fa","Type":"ContainerDied","Data":"6b5f5842e4068982449339aee391bf8ce4b83040bb96a7ac33299b5688596609"} Apr 22 14:28:53.907772 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.907749 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:28:53.994826 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.994747 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-dshm\") pod \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " Apr 22 14:28:53.995265 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.994859 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-model-cache\") pod \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " Apr 22 14:28:53.995265 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.994918 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5zx8\" (UniqueName: \"kubernetes.io/projected/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kube-api-access-p5zx8\") pod \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " Apr 22 14:28:53.995265 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.994948 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-home\") pod \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " Apr 22 14:28:53.995265 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.994976 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kserve-provision-location\") pod \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " Apr 22 14:28:53.995265 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.995020 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/559e50f3-5492-44f3-a546-6f5fee3a4e7b-tls-certs\") pod \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\" (UID: \"559e50f3-5492-44f3-a546-6f5fee3a4e7b\") " Apr 22 14:28:53.995265 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.995222 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-home" (OuterVolumeSpecName: "home") pod "559e50f3-5492-44f3-a546-6f5fee3a4e7b" (UID: "559e50f3-5492-44f3-a546-6f5fee3a4e7b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:53.995768 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.995437 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:53.995930 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.995890 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-model-cache" (OuterVolumeSpecName: "model-cache") pod "559e50f3-5492-44f3-a546-6f5fee3a4e7b" (UID: "559e50f3-5492-44f3-a546-6f5fee3a4e7b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:53.997759 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.997732 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-dshm" (OuterVolumeSpecName: "dshm") pod "559e50f3-5492-44f3-a546-6f5fee3a4e7b" (UID: "559e50f3-5492-44f3-a546-6f5fee3a4e7b"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:53.997859 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.997735 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559e50f3-5492-44f3-a546-6f5fee3a4e7b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "559e50f3-5492-44f3-a546-6f5fee3a4e7b" (UID: "559e50f3-5492-44f3-a546-6f5fee3a4e7b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:28:53.998090 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:53.998063 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kube-api-access-p5zx8" (OuterVolumeSpecName: "kube-api-access-p5zx8") pod "559e50f3-5492-44f3-a546-6f5fee3a4e7b" (UID: "559e50f3-5492-44f3-a546-6f5fee3a4e7b"). InnerVolumeSpecName "kube-api-access-p5zx8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:28:54.061209 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.061166 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "559e50f3-5492-44f3-a546-6f5fee3a4e7b" (UID: "559e50f3-5492-44f3-a546-6f5fee3a4e7b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:54.096740 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.096559 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p5zx8\" (UniqueName: \"kubernetes.io/projected/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kube-api-access-p5zx8\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:54.096740 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.096593 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:54.096740 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.096609 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/559e50f3-5492-44f3-a546-6f5fee3a4e7b-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:54.096740 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.096626 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:54.096740 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.096640 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/559e50f3-5492-44f3-a546-6f5fee3a4e7b-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:28:54.734728 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.734701 2562 generic.go:358] "Generic (PLEG): container finished" podID="559e50f3-5492-44f3-a546-6f5fee3a4e7b" containerID="641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc" exitCode=0 Apr 22 14:28:54.735037 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.734794 2562 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" Apr 22 14:28:54.735037 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.734790 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" event={"ID":"559e50f3-5492-44f3-a546-6f5fee3a4e7b","Type":"ContainerDied","Data":"641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc"} Apr 22 14:28:54.735037 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.734902 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c" event={"ID":"559e50f3-5492-44f3-a546-6f5fee3a4e7b","Type":"ContainerDied","Data":"ddce04dd9376d039fc62bebc6a7c647ac0b507590759768270bf3bab8f5b66fe"} Apr 22 14:28:54.735037 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.734918 2562 scope.go:117] "RemoveContainer" containerID="641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc" Apr 22 14:28:54.744073 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.744057 2562 scope.go:117] "RemoveContainer" containerID="b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa" Apr 22 14:28:54.753594 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.753576 2562 scope.go:117] "RemoveContainer" containerID="641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc" Apr 22 14:28:54.753869 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:28:54.753850 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc\": container with ID starting with 641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc not found: ID does not exist" containerID="641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc" Apr 22 14:28:54.753932 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:28:54.753879 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc"} err="failed to get container status \"641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc\": rpc error: code = NotFound desc = could not find container \"641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc\": container with ID starting with 641657f52821eaeff3add5be20bc5c12b1296f8d9ee4d9eee731a7b3da8f0adc not found: ID does not exist" Apr 22 14:28:54.753932 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.753898 2562 scope.go:117] "RemoveContainer" containerID="b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa" Apr 22 14:28:54.754149 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:28:54.754133 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa\": container with ID starting with b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa not found: ID does not exist" containerID="b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa" Apr 22 14:28:54.754190 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.754156 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa"} err="failed to get container status \"b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa\": rpc error: code = NotFound desc = could not find container \"b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa\": container with ID starting with b1d8479130bcfbbe7e6e9211390f6e266490aeb978b791bcc16716d513b787fa not found: ID does not exist" Apr 22 14:28:54.760944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.760920 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c"] Apr 22 14:28:54.764678 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.764298 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55bf56b4c-vdw6c"] Apr 22 14:28:54.925437 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:54.925358 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559e50f3-5492-44f3-a546-6f5fee3a4e7b" path="/var/lib/kubelet/pods/559e50f3-5492-44f3-a546-6f5fee3a4e7b/volumes" Apr 22 14:28:55.742263 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:28:55.742224 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" event={"ID":"3f637a04-9609-4e7f-8dea-fd300963e9fa","Type":"ContainerStarted","Data":"286335a2aff67f9efda090fafee9cdec6cf369acf9878a6dcd4459ca2aa17ac8"} Apr 22 14:29:15.965640 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:15.965596 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg"] Apr 22 14:29:15.966248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:15.966025 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="559e50f3-5492-44f3-a546-6f5fee3a4e7b" containerName="main" Apr 22 14:29:15.966248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:15.966041 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="559e50f3-5492-44f3-a546-6f5fee3a4e7b" containerName="main" Apr 22 14:29:15.966248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:15.966051 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="559e50f3-5492-44f3-a546-6f5fee3a4e7b" containerName="storage-initializer" Apr 22 14:29:15.966248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:15.966058 2562 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="559e50f3-5492-44f3-a546-6f5fee3a4e7b" containerName="storage-initializer" Apr 22 14:29:15.966248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:15.966118 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="559e50f3-5492-44f3-a546-6f5fee3a4e7b" containerName="main" Apr 22 14:29:16.391781 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.391744 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg"] Apr 22 14:29:16.391989 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.391900 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.394874 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.394847 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 14:29:16.510040 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.510003 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.510236 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.510062 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb45r\" (UniqueName: \"kubernetes.io/projected/2309cb6a-662b-42c8-9a4f-f780f780e03c-kube-api-access-vb45r\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.510315 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.510224 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.510315 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.510277 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.510418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.510338 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.510418 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.510371 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2309cb6a-662b-42c8-9a4f-f780f780e03c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: 
\"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.611288 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.611254 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.611489 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.611300 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.611489 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.611348 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.611489 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.611373 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2309cb6a-662b-42c8-9a4f-f780f780e03c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.611489 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.611411 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.611489 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.611458 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb45r\" (UniqueName: \"kubernetes.io/projected/2309cb6a-662b-42c8-9a4f-f780f780e03c-kube-api-access-vb45r\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.611839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.611814 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.611919 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.611880 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.612155 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.612123 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.614106 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.614064 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.614877 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.614852 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2309cb6a-662b-42c8-9a4f-f780f780e03c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.620569 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:16.620549 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb45r\" (UniqueName: \"kubernetes.io/projected/2309cb6a-662b-42c8-9a4f-f780f780e03c-kube-api-access-vb45r\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:16.704510 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:29:16.704423 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:29:23.561360 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:23.561326 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg"] Apr 22 14:29:23.571165 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:29:23.571131 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2309cb6a_662b_42c8_9a4f_f780f780e03c.slice/crio-3c0dabb347a22168f5b0911579c48b7c88f544d60dab035e4757235d1d10a5f2 WatchSource:0}: Error finding container 3c0dabb347a22168f5b0911579c48b7c88f544d60dab035e4757235d1d10a5f2: Status 404 returned error can't find the container with id 3c0dabb347a22168f5b0911579c48b7c88f544d60dab035e4757235d1d10a5f2 Apr 22 14:29:23.862203 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:23.862165 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" event={"ID":"3f637a04-9609-4e7f-8dea-fd300963e9fa","Type":"ContainerStarted","Data":"d1df2ebfd460cde4f4bf145f4b861d8876d548e2fe595f16c5f4d0ff02c35f0e"} Apr 22 14:29:23.862398 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:23.862312 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:29:23.863715 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:23.863686 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" event={"ID":"2309cb6a-662b-42c8-9a4f-f780f780e03c","Type":"ContainerStarted","Data":"4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d"} Apr 22 
14:29:23.863832 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:23.863720 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" event={"ID":"2309cb6a-662b-42c8-9a4f-f780f780e03c","Type":"ContainerStarted","Data":"3c0dabb347a22168f5b0911579c48b7c88f544d60dab035e4757235d1d10a5f2"} Apr 22 14:29:23.902874 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:23.902813 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" podStartSLOduration=1.9417818279999999 podStartE2EDuration="31.902796587s" podCreationTimestamp="2026-04-22 14:28:52 +0000 UTC" firstStartedPulling="2026-04-22 14:28:53.730700538 +0000 UTC m=+837.430061197" lastFinishedPulling="2026-04-22 14:29:23.691715296 +0000 UTC m=+867.391075956" observedRunningTime="2026-04-22 14:29:23.900903158 +0000 UTC m=+867.600263841" watchObservedRunningTime="2026-04-22 14:29:23.902796587 +0000 UTC m=+867.602157269" Apr 22 14:29:24.873032 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:24.872994 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 14:29:28.886857 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:28.886808 2562 generic.go:358] "Generic (PLEG): container finished" podID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerID="4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d" exitCode=0 Apr 22 14:29:28.887348 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:28.886851 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" 
event={"ID":"2309cb6a-662b-42c8-9a4f-f780f780e03c","Type":"ContainerDied","Data":"4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d"} Apr 22 14:29:32.447643 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:32.447601 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:29:32.448138 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:32.447899 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.40:8082/healthz\": dial tcp 10.133.0.40:8082: connect: connection refused" Apr 22 14:29:32.449003 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:32.448294 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" Apr 22 14:29:32.449729 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:32.449661 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 14:29:32.922606 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:32.922551 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 14:29:38.693172 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:38.690693 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk"] Apr 22 14:29:38.693172 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:38.691123 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="main" containerID="cri-o://286335a2aff67f9efda090fafee9cdec6cf369acf9878a6dcd4459ca2aa17ac8" gracePeriod=30 Apr 22 14:29:38.693172 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:38.691576 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="tokenizer" containerID="cri-o://d1df2ebfd460cde4f4bf145f4b861d8876d548e2fe595f16c5f4d0ff02c35f0e" gracePeriod=30 Apr 22 14:29:38.694258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:38.693280 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 14:29:40.960940 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:40.960908 2562 generic.go:358] "Generic (PLEG): container finished" podID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerID="286335a2aff67f9efda090fafee9cdec6cf369acf9878a6dcd4459ca2aa17ac8" exitCode=0 Apr 22 14:29:40.961336 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:40.960982 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" event={"ID":"3f637a04-9609-4e7f-8dea-fd300963e9fa","Type":"ContainerDied","Data":"286335a2aff67f9efda090fafee9cdec6cf369acf9878a6dcd4459ca2aa17ac8"} Apr 22 14:29:41.968402 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:29:41.968363 2562 generic.go:358] "Generic (PLEG): container finished" podID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerID="d1df2ebfd460cde4f4bf145f4b861d8876d548e2fe595f16c5f4d0ff02c35f0e" exitCode=0
Apr 22 14:29:41.968830 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:41.968437 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" event={"ID":"3f637a04-9609-4e7f-8dea-fd300963e9fa","Type":"ContainerDied","Data":"d1df2ebfd460cde4f4bf145f4b861d8876d548e2fe595f16c5f4d0ff02c35f0e"}
Apr 22 14:29:42.640036 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.640005 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk"
Apr 22 14:29:42.777094 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777006 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-cache\") pod \"3f637a04-9609-4e7f-8dea-fd300963e9fa\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") "
Apr 22 14:29:42.777094 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777079 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-tmp\") pod \"3f637a04-9609-4e7f-8dea-fd300963e9fa\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") "
Apr 22 14:29:42.777310 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777122 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-uds\") pod \"3f637a04-9609-4e7f-8dea-fd300963e9fa\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") "
Apr 22 14:29:42.777310 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777140 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-kserve-provision-location\") pod \"3f637a04-9609-4e7f-8dea-fd300963e9fa\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") "
Apr 22 14:29:42.777310 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777181 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn2vs\" (UniqueName: \"kubernetes.io/projected/3f637a04-9609-4e7f-8dea-fd300963e9fa-kube-api-access-xn2vs\") pod \"3f637a04-9609-4e7f-8dea-fd300963e9fa\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") "
Apr 22 14:29:42.777310 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777243 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f637a04-9609-4e7f-8dea-fd300963e9fa-tls-certs\") pod \"3f637a04-9609-4e7f-8dea-fd300963e9fa\" (UID: \"3f637a04-9609-4e7f-8dea-fd300963e9fa\") "
Apr 22 14:29:42.777529 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777317 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3f637a04-9609-4e7f-8dea-fd300963e9fa" (UID: "3f637a04-9609-4e7f-8dea-fd300963e9fa"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:29:42.777529 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777414 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3f637a04-9609-4e7f-8dea-fd300963e9fa" (UID: "3f637a04-9609-4e7f-8dea-fd300963e9fa"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:29:42.777529 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777439 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3f637a04-9609-4e7f-8dea-fd300963e9fa" (UID: "3f637a04-9609-4e7f-8dea-fd300963e9fa"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:29:42.777707 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777602 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:29:42.777707 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777622 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-tmp\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:29:42.777707 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.777636 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-tokenizer-uds\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:29:42.778058 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.778026 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3f637a04-9609-4e7f-8dea-fd300963e9fa" (UID: "3f637a04-9609-4e7f-8dea-fd300963e9fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:29:42.779927 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.779897 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f637a04-9609-4e7f-8dea-fd300963e9fa-kube-api-access-xn2vs" (OuterVolumeSpecName: "kube-api-access-xn2vs") pod "3f637a04-9609-4e7f-8dea-fd300963e9fa" (UID: "3f637a04-9609-4e7f-8dea-fd300963e9fa"). InnerVolumeSpecName "kube-api-access-xn2vs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:29:42.780040 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.779901 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f637a04-9609-4e7f-8dea-fd300963e9fa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3f637a04-9609-4e7f-8dea-fd300963e9fa" (UID: "3f637a04-9609-4e7f-8dea-fd300963e9fa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:29:42.878440 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.878398 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xn2vs\" (UniqueName: \"kubernetes.io/projected/3f637a04-9609-4e7f-8dea-fd300963e9fa-kube-api-access-xn2vs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:29:42.878629 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.878461 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f637a04-9609-4e7f-8dea-fd300963e9fa-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:29:42.878629 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.878474 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f637a04-9609-4e7f-8dea-fd300963e9fa-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:29:42.975336 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.975287 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk" event={"ID":"3f637a04-9609-4e7f-8dea-fd300963e9fa","Type":"ContainerDied","Data":"21d914a7f8bf214f7240e52b68277b2fe7749c3dbccc610345e28f635532b66f"}
Apr 22 14:29:42.975782 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.975346 2562 scope.go:117] "RemoveContainer" containerID="d1df2ebfd460cde4f4bf145f4b861d8876d548e2fe595f16c5f4d0ff02c35f0e"
Apr 22 14:29:42.975782 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.975302 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk"
Apr 22 14:29:42.985397 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.985371 2562 scope.go:117] "RemoveContainer" containerID="286335a2aff67f9efda090fafee9cdec6cf369acf9878a6dcd4459ca2aa17ac8"
Apr 22 14:29:42.994947 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:42.994921 2562 scope.go:117] "RemoveContainer" containerID="6b5f5842e4068982449339aee391bf8ce4b83040bb96a7ac33299b5688596609"
Apr 22 14:29:43.001416 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:43.001376 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk"]
Apr 22 14:29:43.005880 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:43.005857 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b872clk"]
Apr 22 14:29:44.929514 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:44.929426 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" path="/var/lib/kubelet/pods/3f637a04-9609-4e7f-8dea-fd300963e9fa/volumes"
Apr 22 14:29:52.670523 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.670471 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"]
Apr 22 14:29:52.671127 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.670993 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="storage-initializer"
Apr 22 14:29:52.671127 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.671012 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="storage-initializer"
Apr 22 14:29:52.671127 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.671029 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="tokenizer"
Apr 22 14:29:52.671127 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.671038 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="tokenizer"
Apr 22 14:29:52.671127 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.671067 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="main"
Apr 22 14:29:52.671127 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.671076 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="main"
Apr 22 14:29:52.671429 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.671148 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="main"
Apr 22 14:29:52.671429 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.671163 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f637a04-9609-4e7f-8dea-fd300963e9fa" containerName="tokenizer"
Apr 22 14:29:52.788563 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.788522 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"]
Apr 22 14:29:52.788781 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.788686 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:52.792119 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.792097 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 22 14:29:52.958975 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.958902 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"]
Apr 22 14:29:52.972911 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.972875 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-dshm\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:52.973120 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.973045 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-model-cache\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:52.973196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.973112 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwct\" (UniqueName: \"kubernetes.io/projected/c2f94b19-3997-4048-b2f1-e854a215470b-kube-api-access-5xwct\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:52.973252 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.973196 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:52.973309 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.973278 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2f94b19-3997-4048-b2f1-e854a215470b-tls-certs\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:52.973378 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.973335 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-home\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:52.980202 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.980176 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:52.981771 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.981742 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"]
Apr 22 14:29:52.982928 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:52.982867 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-66559\""
Apr 22 14:29:53.074482 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074448 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.074482 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074492 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.074748 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074578 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.074748 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074617 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-model-cache\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.074748 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074643 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.074748 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074688 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwct\" (UniqueName: \"kubernetes.io/projected/c2f94b19-3997-4048-b2f1-e854a215470b-kube-api-access-5xwct\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.074944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074765 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.074944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074819 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.074944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074878 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2f94b19-3997-4048-b2f1-e854a215470b-tls-certs\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.074944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074927 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-home\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.075138 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074959 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-dshm\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.075138 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.074986 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-576bx\" (UniqueName: \"kubernetes.io/projected/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kube-api-access-576bx\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.075138 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.075100 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-model-cache\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.075323 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.075301 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.075405 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.075354 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-home\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.077809 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.077722 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-dshm\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.077965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.077937 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2f94b19-3997-4048-b2f1-e854a215470b-tls-certs\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.089399 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.089374 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwct\" (UniqueName: \"kubernetes.io/projected/c2f94b19-3997-4048-b2f1-e854a215470b-kube-api-access-5xwct\") pod \"precise-prefix-cache-test-kserve-646bf96947-5wkrj\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.100790 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.100760 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:29:53.176096 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176045 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-576bx\" (UniqueName: \"kubernetes.io/projected/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kube-api-access-576bx\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.176284 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176122 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.176284 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176150 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.176284 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176203 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.176284 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176240 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.176477 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176294 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.176689 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176622 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.176689 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176622 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.176885 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176711 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.176885 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.176730 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.179436 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.179408 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.185936 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.185910 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-576bx\" (UniqueName: \"kubernetes.io/projected/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kube-api-access-576bx\") pod \"precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:53.292637 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:53.292540 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:56.625330 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:56.625298 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"]
Apr 22 14:29:56.626958 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:29:56.626929 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc34abe5e_d671_417c_8bdd_1566fe4e4c83.slice/crio-6f466952836dcbde113304478fad03adcc3333a0928f9893d3887e11c3806fa4 WatchSource:0}: Error finding container 6f466952836dcbde113304478fad03adcc3333a0928f9893d3887e11c3806fa4: Status 404 returned error can't find the container with id 6f466952836dcbde113304478fad03adcc3333a0928f9893d3887e11c3806fa4
Apr 22 14:29:56.639246 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:56.639224 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"]
Apr 22 14:29:56.640399 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:29:56.640375 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f94b19_3997_4048_b2f1_e854a215470b.slice/crio-ab2b12fb5e9ed9c3eb3815375b410ad160e2b041c3b0e6c456b321ebd757df3a WatchSource:0}: Error finding container ab2b12fb5e9ed9c3eb3815375b410ad160e2b041c3b0e6c456b321ebd757df3a: Status 404 returned error can't find the container with id ab2b12fb5e9ed9c3eb3815375b410ad160e2b041c3b0e6c456b321ebd757df3a
Apr 22 14:29:56.881464 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:56.881438 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log"
Apr 22 14:29:56.882637 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:56.882617 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log"
Apr 22 14:29:57.043644 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:57.043608 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" event={"ID":"c34abe5e-d671-417c-8bdd-1566fe4e4c83","Type":"ContainerStarted","Data":"5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035"}
Apr 22 14:29:57.043821 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:57.043716 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" event={"ID":"c34abe5e-d671-417c-8bdd-1566fe4e4c83","Type":"ContainerStarted","Data":"6f466952836dcbde113304478fad03adcc3333a0928f9893d3887e11c3806fa4"}
Apr 22 14:29:57.045125 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:57.045098 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" event={"ID":"c2f94b19-3997-4048-b2f1-e854a215470b","Type":"ContainerStarted","Data":"c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d"}
Apr 22 14:29:57.045223 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:57.045130 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" event={"ID":"c2f94b19-3997-4048-b2f1-e854a215470b","Type":"ContainerStarted","Data":"ab2b12fb5e9ed9c3eb3815375b410ad160e2b041c3b0e6c456b321ebd757df3a"}
Apr 22 14:29:58.051675 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:58.051629 2562 generic.go:358] "Generic (PLEG): container finished" podID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerID="5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035" exitCode=0
Apr 22 14:29:58.052148 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:58.051708 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" event={"ID":"c34abe5e-d671-417c-8bdd-1566fe4e4c83","Type":"ContainerDied","Data":"5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035"}
Apr 22 14:29:58.053786 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:58.053748 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" event={"ID":"2309cb6a-662b-42c8-9a4f-f780f780e03c","Type":"ContainerStarted","Data":"3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620"}
Apr 22 14:29:58.101982 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:58.101911 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podStartSLOduration=15.011895161 podStartE2EDuration="43.101888965s" podCreationTimestamp="2026-04-22 14:29:15 +0000 UTC" firstStartedPulling="2026-04-22 14:29:28.888283002 +0000 UTC m=+872.587643668" lastFinishedPulling="2026-04-22 14:29:56.978276811 +0000 UTC m=+900.677637472" observedRunningTime="2026-04-22 14:29:58.096564681 +0000 UTC m=+901.795925418" watchObservedRunningTime="2026-04-22 14:29:58.101888965 +0000 UTC m=+901.801249648"
Apr 22 14:29:59.059577 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:59.059534 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" event={"ID":"c34abe5e-d671-417c-8bdd-1566fe4e4c83","Type":"ContainerStarted","Data":"2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519"}
Apr 22 14:29:59.059983 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:59.059586 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" event={"ID":"c34abe5e-d671-417c-8bdd-1566fe4e4c83","Type":"ContainerStarted","Data":"5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04"}
Apr 22 14:29:59.059983 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:59.059625 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:29:59.084479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:29:59.084416 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" podStartSLOduration=7.084393869 podStartE2EDuration="7.084393869s" podCreationTimestamp="2026-04-22 14:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:29:59.082677971 +0000 UTC m=+902.782038656" watchObservedRunningTime="2026-04-22 14:29:59.084393869 +0000 UTC m=+902.783754552"
Apr 22 14:30:02.072465 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:02.072428 2562 generic.go:358] "Generic (PLEG): container finished" podID="c2f94b19-3997-4048-b2f1-e854a215470b" containerID="c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d" exitCode=0
Apr 22 14:30:02.073084 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:02.072495 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" event={"ID":"c2f94b19-3997-4048-b2f1-e854a215470b","Type":"ContainerDied","Data":"c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d"}
Apr 22 14:30:03.080381 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:03.080338 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" event={"ID":"c2f94b19-3997-4048-b2f1-e854a215470b","Type":"ContainerStarted","Data":"cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933"}
Apr 22 14:30:03.101943 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:03.101885 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:30:03.101943 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:03.101950 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:30:03.107263 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:03.107203 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" podStartSLOduration=11.107184968 podStartE2EDuration="11.107184968s" podCreationTimestamp="2026-04-22 14:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:30:03.105281886 +0000 UTC m=+906.804642570" watchObservedRunningTime="2026-04-22 14:30:03.107184968 +0000 UTC m=+906.806545653"
Apr 22 14:30:03.115694 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:03.115643 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"
Apr 22 14:30:03.292843 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:03.292799 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"
Apr 22 14:30:03.293252 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:03.293212 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy"
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" Apr 22 14:30:03.294396 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:30:03.294372 2562 logging.go:55] [core] [Channel #32 SubChannel #33]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.43:9003", ServerName: "10.133.0.43:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.43:9003: connect: connection refused" Apr 22 14:30:03.294540 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:03.294383 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.43:8082/healthz\": dial tcp 10.133.0.43:8082: connect: connection refused" Apr 22 14:30:04.098167 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:04.098138 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" Apr 22 14:30:04.293396 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:04.293342 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.43:9003\" within 1s: context deadline exceeded" Apr 22 14:30:06.705452 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:06.705415 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:30:06.705883 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:06.705467 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:30:06.707014 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:06.706984 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 22 14:30:13.294191 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:30:13.294155 2562 logging.go:55] [core] [Channel #34 SubChannel #35]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.43:9003", ServerName: "10.133.0.43:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.43:9003: connect: connection refused" Apr 22 14:30:13.295838 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:13.295815 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" Apr 22 14:30:13.297182 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:13.297163 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" Apr 22 14:30:14.294572 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:14.294511 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.43:9003\" within 1s: context deadline exceeded" Apr 22 14:30:14.294974 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:30:14.294600 2562 logging.go:55] [core] [Channel #34 SubChannel #35]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.43:9003", 
ServerName: "10.133.0.43:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.43:9003: connect: connection refused" Apr 22 14:30:16.705535 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:16.705485 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 22 14:30:26.705126 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:26.705063 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 22 14:30:34.128847 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:34.128812 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" Apr 22 14:30:35.286124 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.284964 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"] Apr 22 14:30:35.286124 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.285396 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="main" containerID="cri-o://5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04" gracePeriod=30 Apr 22 14:30:35.286124 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.285826 2562 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="tokenizer" containerID="cri-o://2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519" gracePeriod=30 Apr 22 14:30:35.288448 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.288417 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"] Apr 22 14:30:35.289095 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.289042 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" podUID="c2f94b19-3997-4048-b2f1-e854a215470b" containerName="main" containerID="cri-o://cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933" gracePeriod=30 Apr 22 14:30:35.581094 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.581067 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" Apr 22 14:30:35.671251 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.671216 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-dshm\") pod \"c2f94b19-3997-4048-b2f1-e854a215470b\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " Apr 22 14:30:35.671455 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.671278 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xwct\" (UniqueName: \"kubernetes.io/projected/c2f94b19-3997-4048-b2f1-e854a215470b-kube-api-access-5xwct\") pod \"c2f94b19-3997-4048-b2f1-e854a215470b\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " Apr 22 14:30:35.671455 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.671315 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-model-cache\") pod \"c2f94b19-3997-4048-b2f1-e854a215470b\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " Apr 22 14:30:35.671455 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.671342 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2f94b19-3997-4048-b2f1-e854a215470b-tls-certs\") pod \"c2f94b19-3997-4048-b2f1-e854a215470b\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " Apr 22 14:30:35.671455 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.671408 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-kserve-provision-location\") pod \"c2f94b19-3997-4048-b2f1-e854a215470b\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " Apr 22 14:30:35.671455 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:30:35.671434 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-home\") pod \"c2f94b19-3997-4048-b2f1-e854a215470b\" (UID: \"c2f94b19-3997-4048-b2f1-e854a215470b\") " Apr 22 14:30:35.671765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.671674 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-model-cache" (OuterVolumeSpecName: "model-cache") pod "c2f94b19-3997-4048-b2f1-e854a215470b" (UID: "c2f94b19-3997-4048-b2f1-e854a215470b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:35.671824 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.671769 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.671933 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.671904 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-home" (OuterVolumeSpecName: "home") pod "c2f94b19-3997-4048-b2f1-e854a215470b" (UID: "c2f94b19-3997-4048-b2f1-e854a215470b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:35.673635 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.673606 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f94b19-3997-4048-b2f1-e854a215470b-kube-api-access-5xwct" (OuterVolumeSpecName: "kube-api-access-5xwct") pod "c2f94b19-3997-4048-b2f1-e854a215470b" (UID: "c2f94b19-3997-4048-b2f1-e854a215470b"). InnerVolumeSpecName "kube-api-access-5xwct". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:30:35.673937 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.673912 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-dshm" (OuterVolumeSpecName: "dshm") pod "c2f94b19-3997-4048-b2f1-e854a215470b" (UID: "c2f94b19-3997-4048-b2f1-e854a215470b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:35.674026 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.673967 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f94b19-3997-4048-b2f1-e854a215470b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c2f94b19-3997-4048-b2f1-e854a215470b" (UID: "c2f94b19-3997-4048-b2f1-e854a215470b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:30:35.737595 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.737529 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2f94b19-3997-4048-b2f1-e854a215470b" (UID: "c2f94b19-3997-4048-b2f1-e854a215470b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:35.772703 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.772643 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.772703 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.772694 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5xwct\" (UniqueName: \"kubernetes.io/projected/c2f94b19-3997-4048-b2f1-e854a215470b-kube-api-access-5xwct\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.772703 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.772707 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2f94b19-3997-4048-b2f1-e854a215470b-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.772703 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.772716 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.773054 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:35.772725 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2f94b19-3997-4048-b2f1-e854a215470b-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:36.223898 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.223864 2562 generic.go:358] "Generic (PLEG): container finished" podID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerID="5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04" exitCode=0 Apr 22 14:30:36.224091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.223940 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" event={"ID":"c34abe5e-d671-417c-8bdd-1566fe4e4c83","Type":"ContainerDied","Data":"5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04"} Apr 22 14:30:36.225609 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.225581 2562 generic.go:358] "Generic (PLEG): container finished" podID="c2f94b19-3997-4048-b2f1-e854a215470b" containerID="cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933" exitCode=0 Apr 22 14:30:36.225776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.225635 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" event={"ID":"c2f94b19-3997-4048-b2f1-e854a215470b","Type":"ContainerDied","Data":"cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933"} Apr 22 14:30:36.225776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.225685 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" event={"ID":"c2f94b19-3997-4048-b2f1-e854a215470b","Type":"ContainerDied","Data":"ab2b12fb5e9ed9c3eb3815375b410ad160e2b041c3b0e6c456b321ebd757df3a"} Apr 22 14:30:36.225776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.225689 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj" Apr 22 14:30:36.225776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.225703 2562 scope.go:117] "RemoveContainer" containerID="cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933" Apr 22 14:30:36.235947 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.235926 2562 scope.go:117] "RemoveContainer" containerID="c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d" Apr 22 14:30:36.251289 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.251248 2562 scope.go:117] "RemoveContainer" containerID="cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933" Apr 22 14:30:36.251632 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:30:36.251603 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933\": container with ID starting with cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933 not found: ID does not exist" containerID="cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933" Apr 22 14:30:36.251763 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.251644 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933"} err="failed to get container status \"cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933\": rpc error: code = NotFound desc = could not find container \"cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933\": container with ID starting with cff189126201ce0d0813bcdf5b2ce495a9b4ca38b62a4afc862d00bac05ff933 not found: ID does not exist" Apr 22 14:30:36.251763 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.251689 2562 scope.go:117] "RemoveContainer" containerID="c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d" Apr 22 
14:30:36.252117 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:30:36.252096 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d\": container with ID starting with c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d not found: ID does not exist" containerID="c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d" Apr 22 14:30:36.252211 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.252122 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d"} err="failed to get container status \"c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d\": rpc error: code = NotFound desc = could not find container \"c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d\": container with ID starting with c60f4939809d75250cf74cbd6ac25a2a612e446c746d98d567f5ce3ae6c1a64d not found: ID does not exist" Apr 22 14:30:36.252211 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.252195 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"] Apr 22 14:30:36.255932 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.255910 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-5wkrj"] Apr 22 14:30:36.705051 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.705008 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 22 14:30:36.927015 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:30:36.926978 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f94b19-3997-4048-b2f1-e854a215470b" path="/var/lib/kubelet/pods/c2f94b19-3997-4048-b2f1-e854a215470b/volumes" Apr 22 14:30:36.952896 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:36.952872 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" Apr 22 14:30:37.087900 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.087865 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-tmp\") pod \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " Apr 22 14:30:37.088078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.087945 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kserve-provision-location\") pod \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " Apr 22 14:30:37.088078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.088008 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-cache\") pod \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " Apr 22 14:30:37.088078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.088039 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-576bx\" (UniqueName: \"kubernetes.io/projected/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kube-api-access-576bx\") pod \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " Apr 22 
14:30:37.088078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.088064 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-uds\") pod \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " Apr 22 14:30:37.088284 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.088114 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tls-certs\") pod \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\" (UID: \"c34abe5e-d671-417c-8bdd-1566fe4e4c83\") " Apr 22 14:30:37.088336 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.088243 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c34abe5e-d671-417c-8bdd-1566fe4e4c83" (UID: "c34abe5e-d671-417c-8bdd-1566fe4e4c83"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:37.088393 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.088341 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c34abe5e-d671-417c-8bdd-1566fe4e4c83" (UID: "c34abe5e-d671-417c-8bdd-1566fe4e4c83"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:37.088894 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.088864 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c34abe5e-d671-417c-8bdd-1566fe4e4c83" (UID: "c34abe5e-d671-417c-8bdd-1566fe4e4c83"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:37.089669 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.089184 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c34abe5e-d671-417c-8bdd-1566fe4e4c83" (UID: "c34abe5e-d671-417c-8bdd-1566fe4e4c83"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:37.089669 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.089394 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-tmp\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:37.089669 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.089419 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:37.089669 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.089435 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:37.089669 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.089450 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tokenizer-uds\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:37.090566 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.090547 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c34abe5e-d671-417c-8bdd-1566fe4e4c83" (UID: "c34abe5e-d671-417c-8bdd-1566fe4e4c83"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:30:37.090791 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.090775 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kube-api-access-576bx" (OuterVolumeSpecName: "kube-api-access-576bx") pod "c34abe5e-d671-417c-8bdd-1566fe4e4c83" (UID: "c34abe5e-d671-417c-8bdd-1566fe4e4c83"). InnerVolumeSpecName "kube-api-access-576bx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:30:37.190258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.190223 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-576bx\" (UniqueName: \"kubernetes.io/projected/c34abe5e-d671-417c-8bdd-1566fe4e4c83-kube-api-access-576bx\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:37.190258 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.190254 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c34abe5e-d671-417c-8bdd-1566fe4e4c83-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:30:37.231368 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.231336 2562 generic.go:358] "Generic (PLEG): container finished" podID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerID="2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519" exitCode=0 Apr 22 14:30:37.231540 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.231415 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" Apr 22 14:30:37.231540 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.231414 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" event={"ID":"c34abe5e-d671-417c-8bdd-1566fe4e4c83","Type":"ContainerDied","Data":"2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519"} Apr 22 14:30:37.231540 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.231459 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x" event={"ID":"c34abe5e-d671-417c-8bdd-1566fe4e4c83","Type":"ContainerDied","Data":"6f466952836dcbde113304478fad03adcc3333a0928f9893d3887e11c3806fa4"} Apr 22 14:30:37.231540 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.231483 2562 scope.go:117] "RemoveContainer" containerID="2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519" Apr 22 14:30:37.240899 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.240877 2562 scope.go:117] "RemoveContainer" containerID="5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04" Apr 22 14:30:37.248926 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.248907 2562 scope.go:117] "RemoveContainer" containerID="5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035" Apr 22 14:30:37.255421 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.255397 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"] Apr 22 14:30:37.258947 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.258926 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-58f499f6d725x"] Apr 22 14:30:37.259029 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.258957 2562 
scope.go:117] "RemoveContainer" containerID="2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519" Apr 22 14:30:37.259254 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:30:37.259232 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519\": container with ID starting with 2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519 not found: ID does not exist" containerID="2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519" Apr 22 14:30:37.259300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.259262 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519"} err="failed to get container status \"2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519\": rpc error: code = NotFound desc = could not find container \"2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519\": container with ID starting with 2fc9565acfe80fefdacea1de8a6344165fb6756f2fd838f48906305df154b519 not found: ID does not exist" Apr 22 14:30:37.259300 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.259281 2562 scope.go:117] "RemoveContainer" containerID="5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04" Apr 22 14:30:37.259535 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:30:37.259518 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04\": container with ID starting with 5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04 not found: ID does not exist" containerID="5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04" Apr 22 14:30:37.259608 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.259544 2562 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04"} err="failed to get container status \"5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04\": rpc error: code = NotFound desc = could not find container \"5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04\": container with ID starting with 5b8cb82567ac308935c9caa4d9f2f38e75383ae17c434a1d6fd872c507131f04 not found: ID does not exist" Apr 22 14:30:37.259608 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.259571 2562 scope.go:117] "RemoveContainer" containerID="5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035" Apr 22 14:30:37.259872 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:30:37.259855 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035\": container with ID starting with 5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035 not found: ID does not exist" containerID="5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035" Apr 22 14:30:37.259914 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:37.259877 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035"} err="failed to get container status \"5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035\": rpc error: code = NotFound desc = could not find container \"5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035\": container with ID starting with 5dd97192b100781848b4fd56043c8d0f58b89e17ee726f7bf6b407f636c90035 not found: ID does not exist" Apr 22 14:30:38.929323 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:38.929282 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" 
path="/var/lib/kubelet/pods/c34abe5e-d671-417c-8bdd-1566fe4e4c83/volumes" Apr 22 14:30:46.705400 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:46.705354 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 22 14:30:48.350643 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.350606 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv"] Apr 22 14:30:48.351168 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351147 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="storage-initializer" Apr 22 14:30:48.351214 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351173 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="storage-initializer" Apr 22 14:30:48.351214 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351186 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="main" Apr 22 14:30:48.351214 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351195 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="main" Apr 22 14:30:48.351313 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351214 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f94b19-3997-4048-b2f1-e854a215470b" containerName="main" Apr 22 14:30:48.351313 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351223 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f94b19-3997-4048-b2f1-e854a215470b" 
containerName="main" Apr 22 14:30:48.351313 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351247 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="tokenizer" Apr 22 14:30:48.351313 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351255 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="tokenizer" Apr 22 14:30:48.351313 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351269 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f94b19-3997-4048-b2f1-e854a215470b" containerName="storage-initializer" Apr 22 14:30:48.351313 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351278 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f94b19-3997-4048-b2f1-e854a215470b" containerName="storage-initializer" Apr 22 14:30:48.351502 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351352 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="tokenizer" Apr 22 14:30:48.351502 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351365 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c34abe5e-d671-417c-8bdd-1566fe4e4c83" containerName="main" Apr 22 14:30:48.351502 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.351374 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2f94b19-3997-4048-b2f1-e854a215470b" containerName="main" Apr 22 14:30:48.357146 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.357121 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.359819 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.359798 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\"" Apr 22 14:30:48.364306 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.364174 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv"] Apr 22 14:30:48.495894 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.495855 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b22ee81d-0cfc-4580-9516-f845e71aa7fa-tls-certs\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.496105 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.495910 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kserve-provision-location\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.496105 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.496057 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-model-cache\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.496227 ip-10-0-133-31 kubenswrapper[2562]: 
I0422 14:30:48.496116 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkqvz\" (UniqueName: \"kubernetes.io/projected/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kube-api-access-jkqvz\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.496227 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.496150 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-dshm\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.496227 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.496215 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-home\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.596912 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.596873 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-home\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.597104 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.596916 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b22ee81d-0cfc-4580-9516-f845e71aa7fa-tls-certs\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.597104 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.597048 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kserve-provision-location\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.597223 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.597178 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-model-cache\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.597323 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.597302 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkqvz\" (UniqueName: \"kubernetes.io/projected/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kube-api-access-jkqvz\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.597412 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.597344 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-dshm\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.597524 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.597497 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-model-cache\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.597704 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.597640 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kserve-provision-location\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.597819 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.597781 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-home\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.600222 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.599961 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b22ee81d-0cfc-4580-9516-f845e71aa7fa-tls-certs\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.600222 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.600012 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-dshm\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.606438 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.606365 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkqvz\" (UniqueName: \"kubernetes.io/projected/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kube-api-access-jkqvz\") pod \"conv-test-round-trip-kserve-6469f58555-gl6gv\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.669894 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.669854 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:48.814701 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:48.814673 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv"] Apr 22 14:30:48.816711 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:30:48.816680 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb22ee81d_0cfc_4580_9516_f845e71aa7fa.slice/crio-d6b86185cc875d4ed7fc8d8c5140e22265d8664b496ddf79c5eebcd9894c18bf WatchSource:0}: Error finding container d6b86185cc875d4ed7fc8d8c5140e22265d8664b496ddf79c5eebcd9894c18bf: Status 404 returned error can't find the container with id d6b86185cc875d4ed7fc8d8c5140e22265d8664b496ddf79c5eebcd9894c18bf Apr 22 14:30:49.282935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:49.282836 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" 
event={"ID":"b22ee81d-0cfc-4580-9516-f845e71aa7fa","Type":"ContainerStarted","Data":"a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938"} Apr 22 14:30:49.282935 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:49.282876 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" event={"ID":"b22ee81d-0cfc-4580-9516-f845e71aa7fa","Type":"ContainerStarted","Data":"d6b86185cc875d4ed7fc8d8c5140e22265d8664b496ddf79c5eebcd9894c18bf"} Apr 22 14:30:53.747226 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.747124 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6"] Apr 22 14:30:53.750635 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.750610 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.753409 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.753388 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 14:30:53.760599 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.760576 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6"] Apr 22 14:30:53.847791 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.847751 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-model-cache\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.847975 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.847805 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-tls-certs\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.847975 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.847903 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-dshm\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.847975 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.847932 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-home\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.848113 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.848011 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kserve-provision-location\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.848113 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.848045 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp8r9\" (UniqueName: \"kubernetes.io/projected/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kube-api-access-hp8r9\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" 
(UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.949067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.949030 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-dshm\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.949067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.949081 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-home\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.949336 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.949176 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kserve-provision-location\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.949497 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.949462 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hp8r9\" (UniqueName: \"kubernetes.io/projected/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kube-api-access-hp8r9\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.949745 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.949724 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-model-cache\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.949913 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.949551 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-home\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.950062 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.949932 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-tls-certs\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.950354 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.950330 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kserve-provision-location\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.950557 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.950490 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-model-cache\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.952016 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.951991 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-dshm\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.952979 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.952961 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-tls-certs\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:53.966354 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:53.966324 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp8r9\" (UniqueName: \"kubernetes.io/projected/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kube-api-access-hp8r9\") pod \"stop-feature-test-kserve-786885549d-4vzv6\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:54.062114 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:54.062022 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:30:54.196394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:54.196360 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6"] Apr 22 14:30:54.198081 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:30:54.198051 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda10bd666_4f6f_47e9_8a5b_55af8a5e1772.slice/crio-3ea4315589791637cfcdf33e4a9abd622f035063c2fd04d4135c5d8b6b4c32c9 WatchSource:0}: Error finding container 3ea4315589791637cfcdf33e4a9abd622f035063c2fd04d4135c5d8b6b4c32c9: Status 404 returned error can't find the container with id 3ea4315589791637cfcdf33e4a9abd622f035063c2fd04d4135c5d8b6b4c32c9 Apr 22 14:30:54.308560 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:54.308522 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" event={"ID":"a10bd666-4f6f-47e9-8a5b-55af8a5e1772","Type":"ContainerStarted","Data":"3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18"} Apr 22 14:30:54.308770 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:54.308588 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" event={"ID":"a10bd666-4f6f-47e9-8a5b-55af8a5e1772","Type":"ContainerStarted","Data":"3ea4315589791637cfcdf33e4a9abd622f035063c2fd04d4135c5d8b6b4c32c9"} Apr 22 14:30:54.310293 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:54.310265 2562 generic.go:358] "Generic (PLEG): container finished" podID="b22ee81d-0cfc-4580-9516-f845e71aa7fa" containerID="a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938" exitCode=0 Apr 22 14:30:54.310399 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:54.310310 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" event={"ID":"b22ee81d-0cfc-4580-9516-f845e71aa7fa","Type":"ContainerDied","Data":"a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938"} Apr 22 14:30:55.323093 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:55.323042 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" event={"ID":"b22ee81d-0cfc-4580-9516-f845e71aa7fa","Type":"ContainerStarted","Data":"ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07"} Apr 22 14:30:55.346337 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:55.346248 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" podStartSLOduration=7.346225715 podStartE2EDuration="7.346225715s" podCreationTimestamp="2026-04-22 14:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:30:55.342793649 +0000 UTC m=+959.042154342" watchObservedRunningTime="2026-04-22 14:30:55.346225715 +0000 UTC m=+959.045586401" Apr 22 14:30:56.705183 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:56.705135 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 22 14:30:57.382020 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:57.381987 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv"] Apr 22 14:30:57.382284 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:57.382258 2562 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" podUID="b22ee81d-0cfc-4580-9516-f845e71aa7fa" containerName="main" containerID="cri-o://ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07" gracePeriod=30 Apr 22 14:30:58.670019 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:58.669977 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:30:59.341623 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:59.341583 2562 generic.go:358] "Generic (PLEG): container finished" podID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerID="3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18" exitCode=0 Apr 22 14:30:59.341920 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:30:59.341676 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" event={"ID":"a10bd666-4f6f-47e9-8a5b-55af8a5e1772","Type":"ContainerDied","Data":"3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18"} Apr 22 14:31:00.348741 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:00.348698 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" event={"ID":"a10bd666-4f6f-47e9-8a5b-55af8a5e1772","Type":"ContainerStarted","Data":"e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288"} Apr 22 14:31:00.376061 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:00.375988 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podStartSLOduration=7.375966484 podStartE2EDuration="7.375966484s" podCreationTimestamp="2026-04-22 14:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:31:00.371611498 +0000 UTC m=+964.070972193" 
watchObservedRunningTime="2026-04-22 14:31:00.375966484 +0000 UTC m=+964.075327166" Apr 22 14:31:04.063006 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:04.062957 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:31:04.063506 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:04.063313 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:31:04.064777 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:04.064743 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:31:06.705839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:06.705789 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 22 14:31:14.063227 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:14.063172 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:31:16.705404 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:16.705292 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" 
podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 22 14:31:24.063131 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:24.063073 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:31:26.705197 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:26.705146 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 22 14:31:27.633184 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.633160 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-6469f58555-gl6gv_b22ee81d-0cfc-4580-9516-f845e71aa7fa/main/0.log" Apr 22 14:31:27.633547 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.633527 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:31:27.685142 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685055 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-dshm\") pod \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " Apr 22 14:31:27.685142 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685117 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-home\") pod \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " Apr 22 14:31:27.685358 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685145 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-model-cache\") pod \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " Apr 22 14:31:27.685358 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685189 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b22ee81d-0cfc-4580-9516-f845e71aa7fa-tls-certs\") pod \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " Apr 22 14:31:27.685358 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685264 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkqvz\" (UniqueName: \"kubernetes.io/projected/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kube-api-access-jkqvz\") pod \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " Apr 22 14:31:27.685358 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685301 2562 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kserve-provision-location\") pod \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\" (UID: \"b22ee81d-0cfc-4580-9516-f845e71aa7fa\") " Apr 22 14:31:27.685559 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685395 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-model-cache" (OuterVolumeSpecName: "model-cache") pod "b22ee81d-0cfc-4580-9516-f845e71aa7fa" (UID: "b22ee81d-0cfc-4580-9516-f845e71aa7fa"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:31:27.685559 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685409 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-home" (OuterVolumeSpecName: "home") pod "b22ee81d-0cfc-4580-9516-f845e71aa7fa" (UID: "b22ee81d-0cfc-4580-9516-f845e71aa7fa"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:31:27.685763 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685586 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:31:27.685763 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.685604 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:31:27.687965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.687940 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-dshm" (OuterVolumeSpecName: "dshm") pod "b22ee81d-0cfc-4580-9516-f845e71aa7fa" (UID: "b22ee81d-0cfc-4580-9516-f845e71aa7fa"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:31:27.688073 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.687977 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b22ee81d-0cfc-4580-9516-f845e71aa7fa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b22ee81d-0cfc-4580-9516-f845e71aa7fa" (UID: "b22ee81d-0cfc-4580-9516-f845e71aa7fa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:31:27.688073 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.688023 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kube-api-access-jkqvz" (OuterVolumeSpecName: "kube-api-access-jkqvz") pod "b22ee81d-0cfc-4580-9516-f845e71aa7fa" (UID: "b22ee81d-0cfc-4580-9516-f845e71aa7fa"). InnerVolumeSpecName "kube-api-access-jkqvz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:31:27.748100 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.748046 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b22ee81d-0cfc-4580-9516-f845e71aa7fa" (UID: "b22ee81d-0cfc-4580-9516-f845e71aa7fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:31:27.787177 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.787138 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkqvz\" (UniqueName: \"kubernetes.io/projected/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kube-api-access-jkqvz\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:31:27.787177 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.787180 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:31:27.787361 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.787195 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b22ee81d-0cfc-4580-9516-f845e71aa7fa-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:31:27.787361 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:27.787212 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b22ee81d-0cfc-4580-9516-f845e71aa7fa-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:31:28.475196 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.475165 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-6469f58555-gl6gv_b22ee81d-0cfc-4580-9516-f845e71aa7fa/main/0.log" Apr 22 14:31:28.475577 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.475553 2562 generic.go:358] "Generic (PLEG): container finished" podID="b22ee81d-0cfc-4580-9516-f845e71aa7fa" containerID="ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07" exitCode=137 Apr 22 14:31:28.475706 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.475621 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" Apr 22 14:31:28.475706 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.475644 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" event={"ID":"b22ee81d-0cfc-4580-9516-f845e71aa7fa","Type":"ContainerDied","Data":"ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07"} Apr 22 14:31:28.475794 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.475707 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv" event={"ID":"b22ee81d-0cfc-4580-9516-f845e71aa7fa","Type":"ContainerDied","Data":"d6b86185cc875d4ed7fc8d8c5140e22265d8664b496ddf79c5eebcd9894c18bf"} Apr 22 14:31:28.475794 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.475732 2562 scope.go:117] "RemoveContainer" containerID="ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07" Apr 22 14:31:28.485472 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.485450 2562 scope.go:117] "RemoveContainer" containerID="a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938" Apr 22 14:31:28.502001 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.501965 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv"] Apr 22 14:31:28.504611 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:31:28.504585 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-6469f58555-gl6gv"] Apr 22 14:31:28.506117 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.506101 2562 scope.go:117] "RemoveContainer" containerID="ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07" Apr 22 14:31:28.506460 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:31:28.506436 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07\": container with ID starting with ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07 not found: ID does not exist" containerID="ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07" Apr 22 14:31:28.506579 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.506467 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07"} err="failed to get container status \"ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07\": rpc error: code = NotFound desc = could not find container \"ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07\": container with ID starting with ee2f6d1383f480be2b0cc110d5ee372c2f531976123ec98455ccc5cc3e672a07 not found: ID does not exist" Apr 22 14:31:28.506579 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.506486 2562 scope.go:117] "RemoveContainer" containerID="a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938" Apr 22 14:31:28.506812 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:31:28.506792 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938\": container with ID starting with 
a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938 not found: ID does not exist" containerID="a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938" Apr 22 14:31:28.506870 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.506815 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938"} err="failed to get container status \"a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938\": rpc error: code = NotFound desc = could not find container \"a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938\": container with ID starting with a012fbdb61c51f5f6d4eb64c0a32a5d100f05896d851c169077ebfa52fe41938 not found: ID does not exist" Apr 22 14:31:28.925859 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:28.925824 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b22ee81d-0cfc-4580-9516-f845e71aa7fa" path="/var/lib/kubelet/pods/b22ee81d-0cfc-4580-9516-f845e71aa7fa/volumes" Apr 22 14:31:34.062785 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:34.062638 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:31:36.715217 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:36.715183 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:31:36.722813 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:36.722779 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:31:43.009447 ip-10-0-133-31 kubenswrapper[2562]: 
I0422 14:31:43.009410 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg"] Apr 22 14:31:43.010341 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:43.010303 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" containerID="cri-o://3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620" gracePeriod=30 Apr 22 14:31:44.063044 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:44.062994 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:31:54.062960 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.062911 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:31:54.760311 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.760274 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q"] Apr 22 14:31:54.760669 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.760637 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b22ee81d-0cfc-4580-9516-f845e71aa7fa" containerName="main" Apr 22 14:31:54.760669 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.760665 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b22ee81d-0cfc-4580-9516-f845e71aa7fa" 
containerName="main" Apr 22 14:31:54.760763 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.760677 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b22ee81d-0cfc-4580-9516-f845e71aa7fa" containerName="storage-initializer" Apr 22 14:31:54.760763 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.760683 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b22ee81d-0cfc-4580-9516-f845e71aa7fa" containerName="storage-initializer" Apr 22 14:31:54.760763 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.760746 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="b22ee81d-0cfc-4580-9516-f845e71aa7fa" containerName="main" Apr 22 14:31:54.764245 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.764229 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.767445 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.767424 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 22 14:31:54.774197 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.774170 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q"] Apr 22 14:31:54.838723 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.838673 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-home\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.838923 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.838759 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.838923 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.838787 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772290d8-4095-4779-8148-a7009796c2f2-tls-certs\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.838923 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.838805 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg64t\" (UniqueName: \"kubernetes.io/projected/772290d8-4095-4779-8148-a7009796c2f2-kube-api-access-wg64t\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.838923 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.838873 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-dshm\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.838923 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.838890 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-model-cache\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.939820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.939786 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-dshm\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.939820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.939823 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-model-cache\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.940139 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.939862 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-home\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.940139 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.939948 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.940139 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.939984 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772290d8-4095-4779-8148-a7009796c2f2-tls-certs\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.940139 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.940008 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wg64t\" (UniqueName: \"kubernetes.io/projected/772290d8-4095-4779-8148-a7009796c2f2-kube-api-access-wg64t\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.940354 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.940308 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-model-cache\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.940411 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.940361 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.941787 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.940667 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-home\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.942740 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.942684 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-dshm\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.943106 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.943085 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772290d8-4095-4779-8148-a7009796c2f2-tls-certs\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:54.948965 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:54.948939 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg64t\" (UniqueName: \"kubernetes.io/projected/772290d8-4095-4779-8148-a7009796c2f2-kube-api-access-wg64t\") pod \"custom-route-timeout-test-kserve-7ffd45c99d-wfq2q\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:55.077008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:55.076981 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:31:55.422596 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:55.422565 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q"] Apr 22 14:31:55.422779 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:31:55.422756 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772290d8_4095_4779_8148_a7009796c2f2.slice/crio-772934f3deb0ccdb0f324d9a6a3c2c856665e3e2f3958a87b9752a84ef91d9b1 WatchSource:0}: Error finding container 772934f3deb0ccdb0f324d9a6a3c2c856665e3e2f3958a87b9752a84ef91d9b1: Status 404 returned error can't find the container with id 772934f3deb0ccdb0f324d9a6a3c2c856665e3e2f3958a87b9752a84ef91d9b1 Apr 22 14:31:55.424687 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:55.424673 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:31:55.588783 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:55.588743 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" event={"ID":"772290d8-4095-4779-8148-a7009796c2f2","Type":"ContainerStarted","Data":"1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba"} Apr 22 14:31:55.588783 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:31:55.588791 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" event={"ID":"772290d8-4095-4779-8148-a7009796c2f2","Type":"ContainerStarted","Data":"772934f3deb0ccdb0f324d9a6a3c2c856665e3e2f3958a87b9752a84ef91d9b1"} Apr 22 14:32:00.612734 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:00.612700 2562 generic.go:358] "Generic (PLEG): container finished" podID="772290d8-4095-4779-8148-a7009796c2f2" 
containerID="1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba" exitCode=0 Apr 22 14:32:00.613120 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:00.612767 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" event={"ID":"772290d8-4095-4779-8148-a7009796c2f2","Type":"ContainerDied","Data":"1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba"} Apr 22 14:32:01.618686 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:01.618629 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" event={"ID":"772290d8-4095-4779-8148-a7009796c2f2","Type":"ContainerStarted","Data":"1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40"} Apr 22 14:32:01.640507 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:01.640455 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podStartSLOduration=7.640438993 podStartE2EDuration="7.640438993s" podCreationTimestamp="2026-04-22 14:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:32:01.638072301 +0000 UTC m=+1025.337432985" watchObservedRunningTime="2026-04-22 14:32:01.640438993 +0000 UTC m=+1025.339799675" Apr 22 14:32:04.062888 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:04.062843 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:32:05.077159 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:05.077114 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:32:05.077756 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:05.077173 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:32:05.079154 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:05.079125 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 14:32:13.337011 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.336984 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg_2309cb6a-662b-42c8-9a4f-f780f780e03c/main/0.log" Apr 22 14:32:13.337412 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.337358 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:32:13.410883 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.410843 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-model-cache\") pod \"2309cb6a-662b-42c8-9a4f-f780f780e03c\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " Apr 22 14:32:13.410883 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.410901 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-home\") pod \"2309cb6a-662b-42c8-9a4f-f780f780e03c\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " Apr 22 14:32:13.411158 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.410940 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb45r\" (UniqueName: \"kubernetes.io/projected/2309cb6a-662b-42c8-9a4f-f780f780e03c-kube-api-access-vb45r\") pod \"2309cb6a-662b-42c8-9a4f-f780f780e03c\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " Apr 22 14:32:13.411158 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.410990 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-dshm\") pod \"2309cb6a-662b-42c8-9a4f-f780f780e03c\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " Apr 22 14:32:13.411158 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.411040 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-kserve-provision-location\") pod \"2309cb6a-662b-42c8-9a4f-f780f780e03c\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " Apr 22 14:32:13.411158 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.411084 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-model-cache" (OuterVolumeSpecName: "model-cache") pod "2309cb6a-662b-42c8-9a4f-f780f780e03c" (UID: "2309cb6a-662b-42c8-9a4f-f780f780e03c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:32:13.411158 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.411118 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2309cb6a-662b-42c8-9a4f-f780f780e03c-tls-certs\") pod \"2309cb6a-662b-42c8-9a4f-f780f780e03c\" (UID: \"2309cb6a-662b-42c8-9a4f-f780f780e03c\") " Apr 22 14:32:13.411431 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.411305 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-home" (OuterVolumeSpecName: "home") pod "2309cb6a-662b-42c8-9a4f-f780f780e03c" (UID: "2309cb6a-662b-42c8-9a4f-f780f780e03c"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:32:13.411431 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.411405 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:32:13.411431 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.411429 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:32:13.413746 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.413714 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-dshm" (OuterVolumeSpecName: "dshm") pod "2309cb6a-662b-42c8-9a4f-f780f780e03c" (UID: "2309cb6a-662b-42c8-9a4f-f780f780e03c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:32:13.413879 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.413823 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2309cb6a-662b-42c8-9a4f-f780f780e03c-kube-api-access-vb45r" (OuterVolumeSpecName: "kube-api-access-vb45r") pod "2309cb6a-662b-42c8-9a4f-f780f780e03c" (UID: "2309cb6a-662b-42c8-9a4f-f780f780e03c"). InnerVolumeSpecName "kube-api-access-vb45r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:32:13.413943 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.413886 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2309cb6a-662b-42c8-9a4f-f780f780e03c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2309cb6a-662b-42c8-9a4f-f780f780e03c" (UID: "2309cb6a-662b-42c8-9a4f-f780f780e03c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:32:13.468394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.468343 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2309cb6a-662b-42c8-9a4f-f780f780e03c" (UID: "2309cb6a-662b-42c8-9a4f-f780f780e03c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:32:13.512644 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.512609 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vb45r\" (UniqueName: \"kubernetes.io/projected/2309cb6a-662b-42c8-9a4f-f780f780e03c-kube-api-access-vb45r\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:32:13.512644 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.512640 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:32:13.512644 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.512668 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2309cb6a-662b-42c8-9a4f-f780f780e03c-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:32:13.512866 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.512681 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2309cb6a-662b-42c8-9a4f-f780f780e03c-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:32:13.670766 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.670734 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg_2309cb6a-662b-42c8-9a4f-f780f780e03c/main/0.log" Apr 22 14:32:13.671184 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.671153 2562 generic.go:358] "Generic (PLEG): container finished" podID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerID="3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620" exitCode=137 Apr 22 14:32:13.671279 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.671226 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" Apr 22 14:32:13.671279 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.671229 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" event={"ID":"2309cb6a-662b-42c8-9a4f-f780f780e03c","Type":"ContainerDied","Data":"3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620"} Apr 22 14:32:13.671279 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.671270 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg" event={"ID":"2309cb6a-662b-42c8-9a4f-f780f780e03c","Type":"ContainerDied","Data":"3c0dabb347a22168f5b0911579c48b7c88f544d60dab035e4757235d1d10a5f2"} Apr 22 14:32:13.671385 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.671291 2562 scope.go:117] "RemoveContainer" containerID="3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620" Apr 22 14:32:13.695849 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.695814 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg"] Apr 22 14:32:13.699144 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.699084 2562 scope.go:117] "RemoveContainer" 
containerID="4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d" Apr 22 14:32:13.700717 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.700685 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-766c677988b2kqg"] Apr 22 14:32:13.769793 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.769764 2562 scope.go:117] "RemoveContainer" containerID="3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620" Apr 22 14:32:13.770199 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:32:13.770176 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620\": container with ID starting with 3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620 not found: ID does not exist" containerID="3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620" Apr 22 14:32:13.770283 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.770214 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620"} err="failed to get container status \"3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620\": rpc error: code = NotFound desc = could not find container \"3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620\": container with ID starting with 3e4d03ab05cc2a335043f5bf2a145ba04c2f38024ccce0e61a7cba7bd6e01620 not found: ID does not exist" Apr 22 14:32:13.770283 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.770246 2562 scope.go:117] "RemoveContainer" containerID="4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d" Apr 22 14:32:13.770598 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:32:13.770579 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d\": container with ID starting with 4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d not found: ID does not exist" containerID="4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d" Apr 22 14:32:13.770697 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:13.770607 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d"} err="failed to get container status \"4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d\": rpc error: code = NotFound desc = could not find container \"4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d\": container with ID starting with 4dc7f5b2daebd3ba4ea69bf600aebaae4f4f282ac1029d3246357e33575c6e5d not found: ID does not exist" Apr 22 14:32:14.063226 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:14.063122 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:32:14.925415 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:14.925369 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" path="/var/lib/kubelet/pods/2309cb6a-662b-42c8-9a4f-f780f780e03c/volumes" Apr 22 14:32:15.077994 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:15.077946 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 14:32:24.063308 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:32:24.063263 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:32:25.078182 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:25.078135 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 14:32:34.063425 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:34.063373 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" probeResult="failure" output="Get \"https://10.133.0.45:8000/health\": dial tcp 10.133.0.45:8000: connect: connection refused" Apr 22 14:32:35.077723 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:35.077675 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 14:32:44.072400 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:44.072366 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:32:44.080728 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:44.080705 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:32:45.077955 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:45.077852 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 14:32:45.338718 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:45.338584 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6"] Apr 22 14:32:45.808360 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:45.807827 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" containerID="cri-o://e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288" gracePeriod=30 Apr 22 14:32:55.078349 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:32:55.078294 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 14:33:05.077474 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:05.077423 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 14:33:06.162309 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.162268 2562 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh"] Apr 22 14:33:06.162848 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.162827 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="storage-initializer" Apr 22 14:33:06.162931 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.162850 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="storage-initializer" Apr 22 14:33:06.162931 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.162867 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" Apr 22 14:33:06.162931 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.162875 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" Apr 22 14:33:06.163085 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.162956 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="2309cb6a-662b-42c8-9a4f-f780f780e03c" containerName="main" Apr 22 14:33:06.167153 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.167121 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.175569 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.175545 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh"] Apr 22 14:33:06.190135 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.190104 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-home\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.190295 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.190141 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-dshm\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.190295 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.190185 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-tls-certs\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.190295 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.190212 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-model-cache\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: 
\"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.190482 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.190339 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tms5t\" (UniqueName: \"kubernetes.io/projected/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kube-api-access-tms5t\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.190482 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.190392 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kserve-provision-location\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.291502 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.291466 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-tls-certs\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.291787 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.291756 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-model-cache\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.291938 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:33:06.291923 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tms5t\" (UniqueName: \"kubernetes.io/projected/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kube-api-access-tms5t\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.292016 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.291960 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kserve-provision-location\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.292077 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.292055 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-home\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.292136 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.292079 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-dshm\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.292764 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.292509 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-model-cache\") pod 
\"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.292764 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.292735 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-home\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.292955 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.292841 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kserve-provision-location\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.294806 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.294782 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-tls-certs\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.295041 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.295026 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-dshm\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.307371 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.307327 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tms5t\" (UniqueName: \"kubernetes.io/projected/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kube-api-access-tms5t\") pod \"stop-feature-test-kserve-786885549d-8dvrh\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.480016 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.479924 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:06.618702 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.618670 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh"] Apr 22 14:33:06.620329 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:33:06.620301 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c2d9abc_336c_4fb5_b64d_6bb3990a91b7.slice/crio-b18c5e78f929b472a68e326d7408662a5ab634c36991034d6ba8e0043458379e WatchSource:0}: Error finding container b18c5e78f929b472a68e326d7408662a5ab634c36991034d6ba8e0043458379e: Status 404 returned error can't find the container with id b18c5e78f929b472a68e326d7408662a5ab634c36991034d6ba8e0043458379e Apr 22 14:33:06.900761 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.900725 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" event={"ID":"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7","Type":"ContainerStarted","Data":"3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6"} Apr 22 14:33:06.900761 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:06.900765 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" 
event={"ID":"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7","Type":"ContainerStarted","Data":"b18c5e78f929b472a68e326d7408662a5ab634c36991034d6ba8e0043458379e"} Apr 22 14:33:11.923295 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:11.923262 2562 generic.go:358] "Generic (PLEG): container finished" podID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerID="3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6" exitCode=0 Apr 22 14:33:11.923687 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:11.923309 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" event={"ID":"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7","Type":"ContainerDied","Data":"3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6"} Apr 22 14:33:12.928216 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:12.928175 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" event={"ID":"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7","Type":"ContainerStarted","Data":"f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312"} Apr 22 14:33:12.951961 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:12.951905 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podStartSLOduration=6.951890756 podStartE2EDuration="6.951890756s" podCreationTimestamp="2026-04-22 14:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:33:12.949730852 +0000 UTC m=+1096.649091535" watchObservedRunningTime="2026-04-22 14:33:12.951890756 +0000 UTC m=+1096.651251437" Apr 22 14:33:15.078533 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:15.078468 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 14:33:16.202529 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.202487 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-786885549d-4vzv6_a10bd666-4f6f-47e9-8a5b-55af8a5e1772/main/0.log" Apr 22 14:33:16.202963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.202946 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:33:16.287562 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.287528 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-home\") pod \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " Apr 22 14:33:16.287776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.287634 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kserve-provision-location\") pod \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " Apr 22 14:33:16.287776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.287695 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp8r9\" (UniqueName: \"kubernetes.io/projected/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kube-api-access-hp8r9\") pod \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " Apr 22 14:33:16.287912 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.287817 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-model-cache\") pod \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " Apr 22 14:33:16.288002 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.287968 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-home" (OuterVolumeSpecName: "home") pod "a10bd666-4f6f-47e9-8a5b-55af8a5e1772" (UID: "a10bd666-4f6f-47e9-8a5b-55af8a5e1772"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:33:16.288105 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.288078 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-dshm\") pod \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " Apr 22 14:33:16.288271 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.288025 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-model-cache" (OuterVolumeSpecName: "model-cache") pod "a10bd666-4f6f-47e9-8a5b-55af8a5e1772" (UID: "a10bd666-4f6f-47e9-8a5b-55af8a5e1772"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:33:16.288394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.288251 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-tls-certs\") pod \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\" (UID: \"a10bd666-4f6f-47e9-8a5b-55af8a5e1772\") " Apr 22 14:33:16.288821 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.288797 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:33:16.288966 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.288952 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:33:16.290361 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.290334 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kube-api-access-hp8r9" (OuterVolumeSpecName: "kube-api-access-hp8r9") pod "a10bd666-4f6f-47e9-8a5b-55af8a5e1772" (UID: "a10bd666-4f6f-47e9-8a5b-55af8a5e1772"). InnerVolumeSpecName "kube-api-access-hp8r9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:33:16.290792 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.290773 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a10bd666-4f6f-47e9-8a5b-55af8a5e1772" (UID: "a10bd666-4f6f-47e9-8a5b-55af8a5e1772"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:33:16.306767 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.306739 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-dshm" (OuterVolumeSpecName: "dshm") pod "a10bd666-4f6f-47e9-8a5b-55af8a5e1772" (UID: "a10bd666-4f6f-47e9-8a5b-55af8a5e1772"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:33:16.355818 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.355781 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a10bd666-4f6f-47e9-8a5b-55af8a5e1772" (UID: "a10bd666-4f6f-47e9-8a5b-55af8a5e1772"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:33:16.390437 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.390352 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:33:16.390437 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.390385 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:33:16.390437 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.390401 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hp8r9\" (UniqueName: \"kubernetes.io/projected/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-kube-api-access-hp8r9\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:33:16.390437 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:33:16.390413 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a10bd666-4f6f-47e9-8a5b-55af8a5e1772-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:33:16.480920 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.480880 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:16.480920 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.480928 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:33:16.482471 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.482443 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:33:16.946542 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.946513 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-786885549d-4vzv6_a10bd666-4f6f-47e9-8a5b-55af8a5e1772/main/0.log" Apr 22 14:33:16.946919 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.946890 2562 generic.go:358] "Generic (PLEG): container finished" podID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerID="e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288" exitCode=137 Apr 22 14:33:16.947050 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.946952 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" Apr 22 14:33:16.947050 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.946958 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" event={"ID":"a10bd666-4f6f-47e9-8a5b-55af8a5e1772","Type":"ContainerDied","Data":"e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288"} Apr 22 14:33:16.947050 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.946993 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6" event={"ID":"a10bd666-4f6f-47e9-8a5b-55af8a5e1772","Type":"ContainerDied","Data":"3ea4315589791637cfcdf33e4a9abd622f035063c2fd04d4135c5d8b6b4c32c9"} Apr 22 14:33:16.947050 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.947015 2562 scope.go:117] "RemoveContainer" containerID="e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288" Apr 22 14:33:16.957479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.957459 2562 scope.go:117] "RemoveContainer" containerID="3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18" Apr 22 14:33:16.969826 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.969249 2562 scope.go:117] "RemoveContainer" containerID="e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288" Apr 22 14:33:16.969826 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:33:16.969690 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288\": container with ID starting with e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288 not found: ID does not exist" containerID="e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288" Apr 22 14:33:16.969826 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.969728 2562 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288"} err="failed to get container status \"e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288\": rpc error: code = NotFound desc = could not find container \"e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288\": container with ID starting with e7c28e8a6c50e250b08c97cda67cd73b9a79149b6c0e40cea8ce5535c2efb288 not found: ID does not exist" Apr 22 14:33:16.969826 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.969755 2562 scope.go:117] "RemoveContainer" containerID="3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18" Apr 22 14:33:16.970224 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:33:16.970189 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18\": container with ID starting with 3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18 not found: ID does not exist" containerID="3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18" Apr 22 14:33:16.970281 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.970221 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18"} err="failed to get container status \"3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18\": rpc error: code = NotFound desc = could not find container \"3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18\": container with ID starting with 3ca2ec74f7c2f69819115ac81df463bdcea8724c891afeed43d5b295e182cd18 not found: ID does not exist" Apr 22 14:33:16.970906 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.970466 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6"] Apr 22 
14:33:16.981885 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:16.981852 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-4vzv6"] Apr 22 14:33:18.926030 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:18.925989 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" path="/var/lib/kubelet/pods/a10bd666-4f6f-47e9-8a5b-55af8a5e1772/volumes" Apr 22 14:33:25.077948 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:25.077898 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" probeResult="failure" output="Get \"https://10.133.0.46:8000/health\": dial tcp 10.133.0.46:8000: connect: connection refused" Apr 22 14:33:26.481276 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:26.481225 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:33:35.087686 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:35.087628 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:33:35.095932 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:35.095906 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:33:36.481424 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:36.481375 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:33:40.955106 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:40.955072 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q"] Apr 22 14:33:40.955493 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:40.955368 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" containerID="cri-o://1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40" gracePeriod=30 Apr 22 14:33:46.480555 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:46.480505 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:33:56.480591 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:33:56.480549 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:34:06.481200 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:06.481155 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:34:08.175423 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:34:08.175389 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h"] Apr 22 14:34:08.175939 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.175919 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" Apr 22 14:34:08.175939 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.175941 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" Apr 22 14:34:08.176069 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.175962 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="storage-initializer" Apr 22 14:34:08.176069 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.175971 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="storage-initializer" Apr 22 14:34:08.176069 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.176054 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="a10bd666-4f6f-47e9-8a5b-55af8a5e1772" containerName="main" Apr 22 14:34:08.180858 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.180836 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.183763 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.183738 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 22 14:34:08.191332 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.191305 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h"] Apr 22 14:34:08.246803 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.246764 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-dshm\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.246803 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.246801 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8a6a753-2348-4c2d-9680-53a5536893e1-tls-certs\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.247025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.246833 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk54m\" (UniqueName: \"kubernetes.io/projected/a8a6a753-2348-4c2d-9680-53a5536893e1-kube-api-access-zk54m\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.247025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.246929 
2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-model-cache\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.247025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.246966 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-kserve-provision-location\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.247025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.246993 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-home\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.348184 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.348151 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-model-cache\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.348395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.348202 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-kserve-provision-location\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.348395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.348242 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-home\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.348395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.348269 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-dshm\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.348395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.348285 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8a6a753-2348-4c2d-9680-53a5536893e1-tls-certs\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.348395 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.348309 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk54m\" (UniqueName: \"kubernetes.io/projected/a8a6a753-2348-4c2d-9680-53a5536893e1-kube-api-access-zk54m\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.348741 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.348633 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-model-cache\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.348955 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.348929 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-kserve-provision-location\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.349066 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.349045 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-home\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.351204 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.351175 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-dshm\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.351399 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.351382 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a8a6a753-2348-4c2d-9680-53a5536893e1-tls-certs\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.356631 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.356601 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk54m\" (UniqueName: \"kubernetes.io/projected/a8a6a753-2348-4c2d-9680-53a5536893e1-kube-api-access-zk54m\") pod \"router-with-refs-test-kserve-8ff8bc774-bcw5h\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.493570 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.493470 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:08.629070 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:08.629040 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h"] Apr 22 14:34:08.630866 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:34:08.630830 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a6a753_2348_4c2d_9680_53a5536893e1.slice/crio-3a77e5cc9e3e2f2165d75e5df54429b3a19689f85c3e789a9f715b99e8ab99c5 WatchSource:0}: Error finding container 3a77e5cc9e3e2f2165d75e5df54429b3a19689f85c3e789a9f715b99e8ab99c5: Status 404 returned error can't find the container with id 3a77e5cc9e3e2f2165d75e5df54429b3a19689f85c3e789a9f715b99e8ab99c5 Apr 22 14:34:09.160966 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:09.160921 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" 
event={"ID":"a8a6a753-2348-4c2d-9680-53a5536893e1","Type":"ContainerStarted","Data":"964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d"} Apr 22 14:34:09.161114 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:09.160973 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" event={"ID":"a8a6a753-2348-4c2d-9680-53a5536893e1","Type":"ContainerStarted","Data":"3a77e5cc9e3e2f2165d75e5df54429b3a19689f85c3e789a9f715b99e8ab99c5"} Apr 22 14:34:11.398220 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.398189 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-7ffd45c99d-wfq2q_772290d8-4095-4779-8148-a7009796c2f2/main/0.log" Apr 22 14:34:11.398666 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.398634 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:34:11.482599 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.482500 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-kserve-provision-location\") pod \"772290d8-4095-4779-8148-a7009796c2f2\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " Apr 22 14:34:11.482599 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.482572 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772290d8-4095-4779-8148-a7009796c2f2-tls-certs\") pod \"772290d8-4095-4779-8148-a7009796c2f2\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " Apr 22 14:34:11.482858 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.482625 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-model-cache\") pod \"772290d8-4095-4779-8148-a7009796c2f2\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " Apr 22 14:34:11.482858 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.482670 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-home\") pod \"772290d8-4095-4779-8148-a7009796c2f2\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " Apr 22 14:34:11.482858 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.482695 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-dshm\") pod \"772290d8-4095-4779-8148-a7009796c2f2\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " Apr 22 14:34:11.482858 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.482741 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg64t\" (UniqueName: \"kubernetes.io/projected/772290d8-4095-4779-8148-a7009796c2f2-kube-api-access-wg64t\") pod \"772290d8-4095-4779-8148-a7009796c2f2\" (UID: \"772290d8-4095-4779-8148-a7009796c2f2\") " Apr 22 14:34:11.483067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.482952 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-model-cache" (OuterVolumeSpecName: "model-cache") pod "772290d8-4095-4779-8148-a7009796c2f2" (UID: "772290d8-4095-4779-8148-a7009796c2f2"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:34:11.483124 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.483071 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:34:11.483180 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.483132 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-home" (OuterVolumeSpecName: "home") pod "772290d8-4095-4779-8148-a7009796c2f2" (UID: "772290d8-4095-4779-8148-a7009796c2f2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:34:11.486117 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.485903 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772290d8-4095-4779-8148-a7009796c2f2-kube-api-access-wg64t" (OuterVolumeSpecName: "kube-api-access-wg64t") pod "772290d8-4095-4779-8148-a7009796c2f2" (UID: "772290d8-4095-4779-8148-a7009796c2f2"). InnerVolumeSpecName "kube-api-access-wg64t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:34:11.486117 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.486070 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-dshm" (OuterVolumeSpecName: "dshm") pod "772290d8-4095-4779-8148-a7009796c2f2" (UID: "772290d8-4095-4779-8148-a7009796c2f2"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:34:11.486321 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.486119 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772290d8-4095-4779-8148-a7009796c2f2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "772290d8-4095-4779-8148-a7009796c2f2" (UID: "772290d8-4095-4779-8148-a7009796c2f2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:34:11.546941 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.546879 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "772290d8-4095-4779-8148-a7009796c2f2" (UID: "772290d8-4095-4779-8148-a7009796c2f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:34:11.584294 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.584215 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:34:11.584294 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.584289 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:34:11.584294 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.584300 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wg64t\" (UniqueName: \"kubernetes.io/projected/772290d8-4095-4779-8148-a7009796c2f2-kube-api-access-wg64t\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:34:11.584622 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.584311 2562 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/772290d8-4095-4779-8148-a7009796c2f2-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:34:11.584622 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:11.584321 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/772290d8-4095-4779-8148-a7009796c2f2-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:34:12.181722 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.181692 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-7ffd45c99d-wfq2q_772290d8-4095-4779-8148-a7009796c2f2/main/0.log" Apr 22 14:34:12.182105 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.182078 2562 generic.go:358] "Generic (PLEG): container finished" podID="772290d8-4095-4779-8148-a7009796c2f2" containerID="1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40" exitCode=137 Apr 22 14:34:12.182203 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.182174 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" Apr 22 14:34:12.182272 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.182166 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" event={"ID":"772290d8-4095-4779-8148-a7009796c2f2","Type":"ContainerDied","Data":"1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40"} Apr 22 14:34:12.182328 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.182295 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q" event={"ID":"772290d8-4095-4779-8148-a7009796c2f2","Type":"ContainerDied","Data":"772934f3deb0ccdb0f324d9a6a3c2c856665e3e2f3958a87b9752a84ef91d9b1"} Apr 22 14:34:12.182328 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.182318 2562 scope.go:117] "RemoveContainer" containerID="1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40" Apr 22 14:34:12.211015 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.210988 2562 scope.go:117] "RemoveContainer" containerID="1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba" Apr 22 14:34:12.211297 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.211276 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q"] Apr 22 14:34:12.216500 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.216475 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7ffd45c99d-wfq2q"] Apr 22 14:34:12.288459 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.288415 2562 scope.go:117] "RemoveContainer" containerID="1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40" Apr 22 14:34:12.288851 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:34:12.288820 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40\": container with ID starting with 1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40 not found: ID does not exist" containerID="1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40" Apr 22 14:34:12.288961 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.288862 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40"} err="failed to get container status \"1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40\": rpc error: code = NotFound desc = could not find container \"1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40\": container with ID starting with 1923303ed16e582328c6ec15eee94748cefa74203ce5bd86deecb3ecc3910a40 not found: ID does not exist" Apr 22 14:34:12.288961 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.288892 2562 scope.go:117] "RemoveContainer" containerID="1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba" Apr 22 14:34:12.289244 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:34:12.289214 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba\": container with ID starting with 1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba not found: ID does not exist" containerID="1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba" Apr 22 14:34:12.289307 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.289257 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba"} err="failed to get container status \"1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba\": rpc error: code = NotFound desc = 
could not find container \"1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba\": container with ID starting with 1b70319fd6e6db6fba86c856f1357da185767a4a020afa03862f8f10288a67ba not found: ID does not exist" Apr 22 14:34:12.926716 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:12.926685 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="772290d8-4095-4779-8148-a7009796c2f2" path="/var/lib/kubelet/pods/772290d8-4095-4779-8148-a7009796c2f2/volumes" Apr 22 14:34:13.187626 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:13.187539 2562 generic.go:358] "Generic (PLEG): container finished" podID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerID="964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d" exitCode=0 Apr 22 14:34:13.187626 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:13.187579 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" event={"ID":"a8a6a753-2348-4c2d-9680-53a5536893e1","Type":"ContainerDied","Data":"964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d"} Apr 22 14:34:14.195958 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:14.195922 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" event={"ID":"a8a6a753-2348-4c2d-9680-53a5536893e1","Type":"ContainerStarted","Data":"7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2"} Apr 22 14:34:14.227891 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:14.227798 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" podStartSLOduration=6.227777663 podStartE2EDuration="6.227777663s" podCreationTimestamp="2026-04-22 14:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:34:14.222121426 +0000 UTC 
m=+1157.921482108" watchObservedRunningTime="2026-04-22 14:34:14.227777663 +0000 UTC m=+1157.927138346" Apr 22 14:34:16.481224 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:16.481128 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:34:18.493773 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:18.493731 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:18.493773 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:18.493780 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:34:18.495576 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:18.495539 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 22 14:34:26.481295 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:26.481242 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:34:28.494577 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:28.494528 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" 
podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 22 14:34:36.480982 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:36.480926 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:34:38.494205 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:38.494154 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 22 14:34:46.480663 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:46.480605 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" probeResult="failure" output="Get \"https://10.133.0.47:8000/health\": dial tcp 10.133.0.47:8000: connect: connection refused" Apr 22 14:34:48.494953 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:48.494908 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 22 14:34:56.491168 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:56.491125 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:34:56.499235 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:56.499204 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:34:56.913817 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:56.913791 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:34:56.916042 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:56.916019 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:34:57.506072 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:57.506032 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh"] Apr 22 14:34:58.381890 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:58.381820 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" containerID="cri-o://f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312" gracePeriod=30 Apr 22 14:34:58.494830 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:34:58.494785 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 22 14:35:08.494030 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:08.493983 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" 
podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 22 14:35:18.494228 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:18.494177 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 22 14:35:28.494355 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.494316 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 22 14:35:28.664829 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.664801 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-786885549d-8dvrh_5c2d9abc-336c-4fb5-b64d-6bb3990a91b7/main/0.log" Apr 22 14:35:28.665193 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.665176 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:35:28.698051 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.698019 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tms5t\" (UniqueName: \"kubernetes.io/projected/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kube-api-access-tms5t\") pod \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " Apr 22 14:35:28.698239 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.698064 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-model-cache\") pod \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " Apr 22 14:35:28.698239 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.698087 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kserve-provision-location\") pod \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " Apr 22 14:35:28.698239 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.698110 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-home\") pod \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " Apr 22 14:35:28.698239 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.698148 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-dshm\") pod \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " Apr 22 14:35:28.698239 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:35:28.698227 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-tls-certs\") pod \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\" (UID: \"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7\") " Apr 22 14:35:28.698527 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.698311 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-model-cache" (OuterVolumeSpecName: "model-cache") pod "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" (UID: "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:35:28.698527 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.698496 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:35:28.698719 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.698682 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-home" (OuterVolumeSpecName: "home") pod "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" (UID: "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:35:28.700954 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.700911 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-dshm" (OuterVolumeSpecName: "dshm") pod "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" (UID: "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:35:28.701078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.700952 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kube-api-access-tms5t" (OuterVolumeSpecName: "kube-api-access-tms5t") pod "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" (UID: "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7"). InnerVolumeSpecName "kube-api-access-tms5t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:35:28.701078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.701046 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" (UID: "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:35:28.758149 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.758089 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" (UID: "5c2d9abc-336c-4fb5-b64d-6bb3990a91b7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:35:28.799576 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.799546 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:35:28.799576 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.799574 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:35:28.799765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.799584 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:35:28.799765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.799595 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:35:28.799765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:28.799605 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tms5t\" (UniqueName: \"kubernetes.io/projected/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7-kube-api-access-tms5t\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:35:29.504869 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.504836 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-786885549d-8dvrh_5c2d9abc-336c-4fb5-b64d-6bb3990a91b7/main/0.log" Apr 22 14:35:29.505275 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.505221 2562 generic.go:358] "Generic (PLEG): container finished" 
podID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerID="f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312" exitCode=137 Apr 22 14:35:29.505345 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.505304 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" event={"ID":"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7","Type":"ContainerDied","Data":"f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312"} Apr 22 14:35:29.505401 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.505351 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" event={"ID":"5c2d9abc-336c-4fb5-b64d-6bb3990a91b7","Type":"ContainerDied","Data":"b18c5e78f929b472a68e326d7408662a5ab634c36991034d6ba8e0043458379e"} Apr 22 14:35:29.505401 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.505374 2562 scope.go:117] "RemoveContainer" containerID="f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312" Apr 22 14:35:29.505497 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.505320 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh" Apr 22 14:35:29.525100 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.525030 2562 scope.go:117] "RemoveContainer" containerID="3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6" Apr 22 14:35:29.526581 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.526559 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh"] Apr 22 14:35:29.532199 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.532171 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-786885549d-8dvrh"] Apr 22 14:35:29.587402 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.587374 2562 scope.go:117] "RemoveContainer" containerID="f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312" Apr 22 14:35:29.587775 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:35:29.587743 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312\": container with ID starting with f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312 not found: ID does not exist" containerID="f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312" Apr 22 14:35:29.587903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.587785 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312"} err="failed to get container status \"f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312\": rpc error: code = NotFound desc = could not find container \"f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312\": container with ID starting with f6a1a607d5e1c35d668f7b54dc409f66efffbe70ce0ac351da041fd7b2f39312 not found: ID does not exist" Apr 22 
14:35:29.587903 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.587806 2562 scope.go:117] "RemoveContainer" containerID="3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6" Apr 22 14:35:29.588078 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:35:29.588064 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6\": container with ID starting with 3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6 not found: ID does not exist" containerID="3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6" Apr 22 14:35:29.588132 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:29.588081 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6"} err="failed to get container status \"3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6\": rpc error: code = NotFound desc = could not find container \"3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6\": container with ID starting with 3c33e111d3788e9aee7dbdaa6a8856aef36bb831e53bd5bfa9219cf3ebf672c6 not found: ID does not exist" Apr 22 14:35:30.926085 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:30.926049 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" path="/var/lib/kubelet/pods/5c2d9abc-336c-4fb5-b64d-6bb3990a91b7/volumes" Apr 22 14:35:35.056052 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.056014 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-7x2qw"] Apr 22 14:35:35.058835 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.056333 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" 
podUID="2d4194bf-7633-499f-b5e0-b4a3418f143e" containerName="manager" containerID="cri-o://0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309" gracePeriod=30 Apr 22 14:35:35.303370 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.303343 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:35:35.357381 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.357348 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2mck\" (UniqueName: \"kubernetes.io/projected/2d4194bf-7633-499f-b5e0-b4a3418f143e-kube-api-access-j2mck\") pod \"2d4194bf-7633-499f-b5e0-b4a3418f143e\" (UID: \"2d4194bf-7633-499f-b5e0-b4a3418f143e\") " Apr 22 14:35:35.357570 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.357430 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d4194bf-7633-499f-b5e0-b4a3418f143e-cert\") pod \"2d4194bf-7633-499f-b5e0-b4a3418f143e\" (UID: \"2d4194bf-7633-499f-b5e0-b4a3418f143e\") " Apr 22 14:35:35.359518 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.359489 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d4194bf-7633-499f-b5e0-b4a3418f143e-kube-api-access-j2mck" (OuterVolumeSpecName: "kube-api-access-j2mck") pod "2d4194bf-7633-499f-b5e0-b4a3418f143e" (UID: "2d4194bf-7633-499f-b5e0-b4a3418f143e"). InnerVolumeSpecName "kube-api-access-j2mck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:35:35.359684 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.359579 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d4194bf-7633-499f-b5e0-b4a3418f143e-cert" (OuterVolumeSpecName: "cert") pod "2d4194bf-7633-499f-b5e0-b4a3418f143e" (UID: "2d4194bf-7633-499f-b5e0-b4a3418f143e"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:35:35.458223 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.458168 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j2mck\" (UniqueName: \"kubernetes.io/projected/2d4194bf-7633-499f-b5e0-b4a3418f143e-kube-api-access-j2mck\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:35:35.458223 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.458220 2562 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d4194bf-7633-499f-b5e0-b4a3418f143e-cert\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:35:35.528046 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.528014 2562 generic.go:358] "Generic (PLEG): container finished" podID="2d4194bf-7633-499f-b5e0-b4a3418f143e" containerID="0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309" exitCode=0 Apr 22 14:35:35.528188 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.528081 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" event={"ID":"2d4194bf-7633-499f-b5e0-b4a3418f143e","Type":"ContainerDied","Data":"0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309"} Apr 22 14:35:35.528188 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.528109 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" event={"ID":"2d4194bf-7633-499f-b5e0-b4a3418f143e","Type":"ContainerDied","Data":"ccf778a53289e53e7c38849f2e7ec39f4c4bbf8db7b51d11d5647294a4253148"} Apr 22 14:35:35.528188 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.528125 2562 scope.go:117] "RemoveContainer" containerID="0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309" Apr 22 14:35:35.528188 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.528134 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6f8c758999-7x2qw" Apr 22 14:35:35.537239 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.537189 2562 scope.go:117] "RemoveContainer" containerID="0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309" Apr 22 14:35:35.537498 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:35:35.537477 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309\": container with ID starting with 0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309 not found: ID does not exist" containerID="0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309" Apr 22 14:35:35.537579 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.537506 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309"} err="failed to get container status \"0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309\": rpc error: code = NotFound desc = could not find container \"0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309\": container with ID starting with 0601053b3b18657110cf13720001c96a58580a2b338460d24fc0e2e62edbd309 not found: ID does not exist" Apr 22 14:35:35.553484 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.553454 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-7x2qw"] Apr 22 14:35:35.556759 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:35.556734 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-6f8c758999-7x2qw"] Apr 22 14:35:36.925989 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:36.925943 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d4194bf-7633-499f-b5e0-b4a3418f143e" 
path="/var/lib/kubelet/pods/2d4194bf-7633-499f-b5e0-b4a3418f143e/volumes" Apr 22 14:35:38.494210 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:38.494166 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" probeResult="failure" output="Get \"https://10.133.0.48:8000/health\": dial tcp 10.133.0.48:8000: connect: connection refused" Apr 22 14:35:48.504661 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:48.504560 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:35:48.512565 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:35:48.512540 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:36:00.288812 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:00.288781 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h"] Apr 22 14:36:00.289352 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:00.289139 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main" containerID="cri-o://7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2" gracePeriod=30 Apr 22 14:36:16.112484 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112450 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f"] Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112803 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="storage-initializer" Apr 22 
14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112815 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="storage-initializer" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112829 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d4194bf-7633-499f-b5e0-b4a3418f143e" containerName="manager" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112834 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d4194bf-7633-499f-b5e0-b4a3418f143e" containerName="manager" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112849 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112854 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112862 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="storage-initializer" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112867 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="storage-initializer" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112875 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112880 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" Apr 22 14:36:16.112963 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:36:16.112926 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d4194bf-7633-499f-b5e0-b4a3418f143e" containerName="manager" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112935 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c2d9abc-336c-4fb5-b64d-6bb3990a91b7" containerName="main" Apr 22 14:36:16.112963 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.112942 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="772290d8-4095-4779-8148-a7009796c2f2" containerName="main" Apr 22 14:36:16.117402 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.117374 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.120111 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.120091 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 14:36:16.120328 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.120314 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-chfd9\"" Apr 22 14:36:16.130445 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.130421 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f"] Apr 22 14:36:16.146850 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.146709 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx"] Apr 22 14:36:16.151128 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.151099 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.162991 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.162964 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx"] Apr 22 14:36:16.212872 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.212828 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.213045 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.212900 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.213045 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.212939 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.213045 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.212968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.213045 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.212992 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz788\" (UniqueName: \"kubernetes.io/projected/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kube-api-access-tz788\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.213191 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.213105 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.314141 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314105 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.314336 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314149 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5xjf\" (UniqueName: \"kubernetes.io/projected/1208001a-27e8-45e7-8213-f205b5eb60ec-kube-api-access-g5xjf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.314336 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314185 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz788\" (UniqueName: \"kubernetes.io/projected/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kube-api-access-tz788\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.314336 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314289 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1208001a-27e8-45e7-8213-f205b5eb60ec-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.314336 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314323 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.314571 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314364 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.314571 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314407 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.314571 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314429 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.314571 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314449 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.314571 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314471 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.314571 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314485 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.314571 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314508 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.314571 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314556 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.314989 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314769 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.314989 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.314881 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.316580 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.316562 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.316769 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.316754 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.324509 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.324488 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz788\" (UniqueName: \"kubernetes.io/projected/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kube-api-access-tz788\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.415781 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.415686 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1208001a-27e8-45e7-8213-f205b5eb60ec-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.415781 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.415734 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.416010 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.415924 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.416010 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.415976 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" 
(UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.416119 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.416022 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.416119 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.416054 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5xjf\" (UniqueName: \"kubernetes.io/projected/1208001a-27e8-45e7-8213-f205b5eb60ec-kube-api-access-g5xjf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.416278 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.416252 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.416479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.416402 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.416617 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.416565 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.418264 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.418238 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.418421 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.418405 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1208001a-27e8-45e7-8213-f205b5eb60ec-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.425220 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.425199 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5xjf\" (UniqueName: \"kubernetes.io/projected/1208001a-27e8-45e7-8213-f205b5eb60ec-kube-api-access-g5xjf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.427147 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.427126 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:16.464692 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.464641 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:16.572260 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.572230 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f"] Apr 22 14:36:16.574545 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:36:16.574506 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb4a4c2_6e6e_424c_b8b9_f1eb0491555b.slice/crio-d231c6887a533120dd71657cd352964bc51fc16ec41457d8ce1e2977e5ce210b WatchSource:0}: Error finding container d231c6887a533120dd71657cd352964bc51fc16ec41457d8ce1e2977e5ce210b: Status 404 returned error can't find the container with id d231c6887a533120dd71657cd352964bc51fc16ec41457d8ce1e2977e5ce210b Apr 22 14:36:16.615386 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.615356 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx"] Apr 22 14:36:16.616055 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:36:16.616025 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1208001a_27e8_45e7_8213_f205b5eb60ec.slice/crio-007b24f4d20339611a8feb47c6ea57bccfc1d19b069b46f71ec39c891d3bfa6d WatchSource:0}: Error finding container 007b24f4d20339611a8feb47c6ea57bccfc1d19b069b46f71ec39c891d3bfa6d: Status 404 returned error can't find the container with id 
007b24f4d20339611a8feb47c6ea57bccfc1d19b069b46f71ec39c891d3bfa6d Apr 22 14:36:16.688427 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.688386 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" event={"ID":"1208001a-27e8-45e7-8213-f205b5eb60ec","Type":"ContainerStarted","Data":"169751c881de610d85657e5aa14a0c9fe277c85887cc12dde98acc915739e9e6"} Apr 22 14:36:16.688427 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.688435 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" event={"ID":"1208001a-27e8-45e7-8213-f205b5eb60ec","Type":"ContainerStarted","Data":"007b24f4d20339611a8feb47c6ea57bccfc1d19b069b46f71ec39c891d3bfa6d"} Apr 22 14:36:16.689607 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:16.689570 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" event={"ID":"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b","Type":"ContainerStarted","Data":"d231c6887a533120dd71657cd352964bc51fc16ec41457d8ce1e2977e5ce210b"} Apr 22 14:36:18.717756 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:18.717714 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" event={"ID":"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b","Type":"ContainerStarted","Data":"2178ad3b78fbe363127dc4af327cf12b568675e38fcff857106729f4767d37b7"} Apr 22 14:36:18.718296 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:18.718286 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:19.723917 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:19.723870 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" event={"ID":"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b","Type":"ContainerStarted","Data":"419bde872344537e5b20df4e4092d9472184b84c18cb1b654eb904d12681f641"} Apr 22 14:36:21.733411 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:21.733373 2562 generic.go:358] "Generic (PLEG): container finished" podID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerID="169751c881de610d85657e5aa14a0c9fe277c85887cc12dde98acc915739e9e6" exitCode=0 Apr 22 14:36:21.733802 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:21.733456 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" event={"ID":"1208001a-27e8-45e7-8213-f205b5eb60ec","Type":"ContainerDied","Data":"169751c881de610d85657e5aa14a0c9fe277c85887cc12dde98acc915739e9e6"} Apr 22 14:36:22.739342 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:22.739297 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" event={"ID":"1208001a-27e8-45e7-8213-f205b5eb60ec","Type":"ContainerStarted","Data":"c4393ed6d5f21ab2a393b6e25ce197ed07c23af0394ac35c20298cffeffe1837"} Apr 22 14:36:22.766345 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:22.766280 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podStartSLOduration=6.766258449 podStartE2EDuration="6.766258449s" podCreationTimestamp="2026-04-22 14:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:36:22.762550851 +0000 UTC m=+1286.461911536" watchObservedRunningTime="2026-04-22 14:36:22.766258449 +0000 UTC m=+1286.465619145" Apr 22 14:36:23.746472 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:23.746436 2562 generic.go:358] 
"Generic (PLEG): container finished" podID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerID="419bde872344537e5b20df4e4092d9472184b84c18cb1b654eb904d12681f641" exitCode=0 Apr 22 14:36:23.746969 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:23.746487 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" event={"ID":"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b","Type":"ContainerDied","Data":"419bde872344537e5b20df4e4092d9472184b84c18cb1b654eb904d12681f641"} Apr 22 14:36:24.752277 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:24.752239 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" event={"ID":"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b","Type":"ContainerStarted","Data":"0bc9b90fb78448d24e4128fc787b0b2f5fb0a3d7af02bcaefadbb0f32b71d1fe"} Apr 22 14:36:24.778306 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:24.775491 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podStartSLOduration=7.694487882 podStartE2EDuration="8.775468871s" podCreationTimestamp="2026-04-22 14:36:16 +0000 UTC" firstStartedPulling="2026-04-22 14:36:16.576933307 +0000 UTC m=+1280.276293971" lastFinishedPulling="2026-04-22 14:36:17.657914294 +0000 UTC m=+1281.357274960" observedRunningTime="2026-04-22 14:36:24.77387955 +0000 UTC m=+1288.473240234" watchObservedRunningTime="2026-04-22 14:36:24.775468871 +0000 UTC m=+1288.474829555" Apr 22 14:36:26.427474 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:26.427442 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:26.427474 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:26.427476 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:26.429168 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:26.429129 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused" Apr 22 14:36:26.465309 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:26.465271 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:26.465309 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:26.465323 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:36:26.466915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:26.466869 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 22 14:36:27.890358 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.890317 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"] Apr 22 14:36:27.898742 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.896599 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:27.899536 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.899507 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 22 14:36:27.904560 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.904530 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"] Apr 22 14:36:27.933427 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.933359 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:27.933598 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.933433 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:27.933598 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.933478 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c66d796b-2624-4865-9992-c1963e92fdab-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:27.933598 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.933520 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmpfr\" (UniqueName: \"kubernetes.io/projected/c66d796b-2624-4865-9992-c1963e92fdab-kube-api-access-cmpfr\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:27.933744 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.933672 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:27.933744 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:27.933718 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.034228 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.034191 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.034228 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.034232 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.034486 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.034379 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c66d796b-2624-4865-9992-c1963e92fdab-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.034486 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.034447 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmpfr\" (UniqueName: \"kubernetes.io/projected/c66d796b-2624-4865-9992-c1963e92fdab-kube-api-access-cmpfr\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.034612 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.034569 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.034691 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.034601 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.034691 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.034676 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.035035 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.035009 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.035212 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.035035 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.036869 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.036844 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.037123 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.037100 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c66d796b-2624-4865-9992-c1963e92fdab-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.042230 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.042200 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmpfr\" (UniqueName: \"kubernetes.io/projected/c66d796b-2624-4865-9992-c1963e92fdab-kube-api-access-cmpfr\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.211494 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.211379 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:28.360465 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.360374 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"] Apr 22 14:36:28.771221 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.771175 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" event={"ID":"c66d796b-2624-4865-9992-c1963e92fdab","Type":"ContainerStarted","Data":"1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a"} Apr 22 14:36:28.771396 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:28.771231 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" event={"ID":"c66d796b-2624-4865-9992-c1963e92fdab","Type":"ContainerStarted","Data":"0cad2ca26e5e370587b65be428694ab5fad40853ff612e9a626f841505ad88ab"} Apr 22 14:36:30.623834 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.623806 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-8ff8bc774-bcw5h_a8a6a753-2348-4c2d-9680-53a5536893e1/main/0.log" Apr 22 14:36:30.624301 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.624249 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:36:30.660614 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.660582 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-home\") pod \"a8a6a753-2348-4c2d-9680-53a5536893e1\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " Apr 22 14:36:30.660808 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.660706 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk54m\" (UniqueName: \"kubernetes.io/projected/a8a6a753-2348-4c2d-9680-53a5536893e1-kube-api-access-zk54m\") pod \"a8a6a753-2348-4c2d-9680-53a5536893e1\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " Apr 22 14:36:30.660808 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.660746 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-kserve-provision-location\") pod \"a8a6a753-2348-4c2d-9680-53a5536893e1\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " Apr 22 14:36:30.660808 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.660796 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-dshm\") pod \"a8a6a753-2348-4c2d-9680-53a5536893e1\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " Apr 22 14:36:30.660994 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.660847 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8a6a753-2348-4c2d-9680-53a5536893e1-tls-certs\") pod \"a8a6a753-2348-4c2d-9680-53a5536893e1\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " Apr 22 14:36:30.660994 ip-10-0-133-31 kubenswrapper[2562]: 
I0422 14:36:30.660872 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-model-cache\") pod \"a8a6a753-2348-4c2d-9680-53a5536893e1\" (UID: \"a8a6a753-2348-4c2d-9680-53a5536893e1\") " Apr 22 14:36:30.661113 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.661076 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-home" (OuterVolumeSpecName: "home") pod "a8a6a753-2348-4c2d-9680-53a5536893e1" (UID: "a8a6a753-2348-4c2d-9680-53a5536893e1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:36:30.661367 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.661328 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-model-cache" (OuterVolumeSpecName: "model-cache") pod "a8a6a753-2348-4c2d-9680-53a5536893e1" (UID: "a8a6a753-2348-4c2d-9680-53a5536893e1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:36:30.663750 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.663617 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a6a753-2348-4c2d-9680-53a5536893e1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a8a6a753-2348-4c2d-9680-53a5536893e1" (UID: "a8a6a753-2348-4c2d-9680-53a5536893e1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:36:30.663750 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.663632 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-dshm" (OuterVolumeSpecName: "dshm") pod "a8a6a753-2348-4c2d-9680-53a5536893e1" (UID: "a8a6a753-2348-4c2d-9680-53a5536893e1"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:36:30.663750 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.663728 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a6a753-2348-4c2d-9680-53a5536893e1-kube-api-access-zk54m" (OuterVolumeSpecName: "kube-api-access-zk54m") pod "a8a6a753-2348-4c2d-9680-53a5536893e1" (UID: "a8a6a753-2348-4c2d-9680-53a5536893e1"). InnerVolumeSpecName "kube-api-access-zk54m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:36:30.719085 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.719035 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8a6a753-2348-4c2d-9680-53a5536893e1" (UID: "a8a6a753-2348-4c2d-9680-53a5536893e1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:36:30.762772 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.762729 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zk54m\" (UniqueName: \"kubernetes.io/projected/a8a6a753-2348-4c2d-9680-53a5536893e1-kube-api-access-zk54m\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:36:30.762772 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.762771 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:36:30.762977 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.762789 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:36:30.762977 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.762806 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8a6a753-2348-4c2d-9680-53a5536893e1-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:36:30.762977 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.762823 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:36:30.762977 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.762835 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8a6a753-2348-4c2d-9680-53a5536893e1-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:36:30.782743 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.782709 2562 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-8ff8bc774-bcw5h_a8a6a753-2348-4c2d-9680-53a5536893e1/main/0.log" Apr 22 14:36:30.783142 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.783111 2562 generic.go:358] "Generic (PLEG): container finished" podID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerID="7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2" exitCode=137 Apr 22 14:36:30.783273 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.783205 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" Apr 22 14:36:30.783273 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.783205 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" event={"ID":"a8a6a753-2348-4c2d-9680-53a5536893e1","Type":"ContainerDied","Data":"7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2"} Apr 22 14:36:30.783273 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.783260 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h" event={"ID":"a8a6a753-2348-4c2d-9680-53a5536893e1","Type":"ContainerDied","Data":"3a77e5cc9e3e2f2165d75e5df54429b3a19689f85c3e789a9f715b99e8ab99c5"} Apr 22 14:36:30.783428 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.783283 2562 scope.go:117] "RemoveContainer" containerID="7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2" Apr 22 14:36:30.809709 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.809679 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h"] Apr 22 14:36:30.816067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.816039 2562 scope.go:117] "RemoveContainer" containerID="964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d" Apr 22 14:36:30.816180 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.816064 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-8ff8bc774-bcw5h"] Apr 22 14:36:30.881325 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.881299 2562 scope.go:117] "RemoveContainer" containerID="7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2" Apr 22 14:36:30.881706 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:36:30.881678 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2\": container with ID starting with 7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2 not found: ID does not exist" containerID="7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2" Apr 22 14:36:30.881836 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.881719 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2"} err="failed to get container status \"7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2\": rpc error: code = NotFound desc = could not find container \"7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2\": container with ID starting with 7038a08bdeee13ecb15d228b646fe49015c4142ccc114390d26e217d28e2c1a2 not found: ID does not exist" Apr 22 14:36:30.881836 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.881747 2562 scope.go:117] "RemoveContainer" containerID="964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d" Apr 22 14:36:30.882113 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:36:30.882077 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d\": container with ID starting with 
964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d not found: ID does not exist" containerID="964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d" Apr 22 14:36:30.882191 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.882112 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d"} err="failed to get container status \"964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d\": rpc error: code = NotFound desc = could not find container \"964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d\": container with ID starting with 964383cd2210c647cb885ee9ea82b2ee0d94c70cda0586daf51fe3a5c861bc1d not found: ID does not exist" Apr 22 14:36:30.926079 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:30.926040 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" path="/var/lib/kubelet/pods/a8a6a753-2348-4c2d-9680-53a5536893e1/volumes" Apr 22 14:36:33.797710 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:33.797670 2562 generic.go:358] "Generic (PLEG): container finished" podID="c66d796b-2624-4865-9992-c1963e92fdab" containerID="1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a" exitCode=0 Apr 22 14:36:33.798086 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:33.797744 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" event={"ID":"c66d796b-2624-4865-9992-c1963e92fdab","Type":"ContainerDied","Data":"1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a"} Apr 22 14:36:34.804854 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:34.804811 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" 
event={"ID":"c66d796b-2624-4865-9992-c1963e92fdab","Type":"ContainerStarted","Data":"354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9"} Apr 22 14:36:34.830738 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:34.830670 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podStartSLOduration=7.830632315 podStartE2EDuration="7.830632315s" podCreationTimestamp="2026-04-22 14:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:36:34.825866676 +0000 UTC m=+1298.525227359" watchObservedRunningTime="2026-04-22 14:36:34.830632315 +0000 UTC m=+1298.529992998" Apr 22 14:36:36.428606 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:36.428552 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused" Apr 22 14:36:36.447370 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:36.447336 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:36:36.466018 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:36.465973 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 22 14:36:38.211924 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:38.211875 2562 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:38.212334 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:38.211943 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" Apr 22 14:36:38.213464 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:38.213425 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 22 14:36:46.428411 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:46.428358 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused" Apr 22 14:36:46.465697 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:46.465625 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 22 14:36:48.212360 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:48.212312 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial 
tcp 10.133.0.51:8000: connect: connection refused" Apr 22 14:36:56.428173 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:56.428121 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused" Apr 22 14:36:56.465341 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:56.465293 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused" Apr 22 14:36:58.212394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:36:58.212347 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused" Apr 22 14:37:06.428554 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:06.428498 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused" Apr 22 14:37:06.466002 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:06.465956 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:37:08.211907 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:08.211852 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:37:16.428676 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:16.428539 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:37:16.465584 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:16.465534 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:37:18.212956 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:18.212901 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:37:26.427509 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:26.427456 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:37:26.465180 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:26.465140 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:37:28.212434 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:28.212366 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:37:36.428391 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:36.428320 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:37:36.465538 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:36.465495 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:37:38.212659 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:38.212603 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:37:46.427950 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:46.427896 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:37:46.465993 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:46.465953 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:37:48.212071 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:48.212018 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:37:56.427985 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:56.427924 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:37:56.465459 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:56.465415 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:37:58.212776 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:37:58.212733 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:38:06.428264 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:06.428214 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:38:06.466053 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:06.466002 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:38:08.212438 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:08.212394 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:38:16.428594 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:16.428531 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:38:16.465772 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:16.465734 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:38:18.212622 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:18.212574 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:38:26.428224 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:26.428177 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:38:26.465363 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:26.465316 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:38:28.212334 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:28.212290 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:38:36.428430 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:36.428381 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:38:36.465517 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:36.465474 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:38:38.211874 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:38.211822 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:38:46.427923 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:46.427829 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:38:46.465247 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:46.465203 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:38:48.212041 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:48.211991 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:38:56.427971 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:56.427924 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:38:56.466022 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:56.465975 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:38:58.212583 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:38:58.212530 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:39:06.427674 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:06.427611 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:39:06.465383 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:06.465336 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" probeResult="failure" output="Get \"https://10.133.0.50:8000/health\": dial tcp 10.133.0.50:8000: connect: connection refused"
Apr 22 14:39:08.212441 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:08.212387 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:39:16.427561 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:16.427511 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.49:8001/health\": dial tcp 10.133.0.49:8001: connect: connection refused"
Apr 22 14:39:16.475673 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:16.475620 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx"
Apr 22 14:39:16.483319 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:16.483286 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx"
Apr 22 14:39:18.212416 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:18.212370 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" probeResult="failure" output="Get \"https://10.133.0.51:8000/health\": dial tcp 10.133.0.51:8000: connect: connection refused"
Apr 22 14:39:26.437669 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:26.437620 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f"
Apr 22 14:39:26.450204 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:26.450178 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f"
Apr 22 14:39:28.230457 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:28.230418 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"
Apr 22 14:39:28.240628 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:28.240593 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"
Apr 22 14:39:36.718705 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:36.718665 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"]
Apr 22 14:39:36.719288 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:36.718950 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main" containerID="cri-o://354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9" gracePeriod=30
Apr 22 14:39:56.945620 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:56.945592 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log"
Apr 22 14:39:56.948665 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:39:56.948628 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log"
Apr 22 14:40:06.998440 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:06.998410 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc_c66d796b-2624-4865-9992-c1963e92fdab/main/0.log"
Apr 22 14:40:06.998860 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:06.998794 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"
Apr 22 14:40:07.125774 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.125736 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-kserve-provision-location\") pod \"c66d796b-2624-4865-9992-c1963e92fdab\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") "
Apr 22 14:40:07.125969 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.125789 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-dshm\") pod \"c66d796b-2624-4865-9992-c1963e92fdab\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") "
Apr 22 14:40:07.125969 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.125819 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c66d796b-2624-4865-9992-c1963e92fdab-tls-certs\") pod \"c66d796b-2624-4865-9992-c1963e92fdab\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") "
Apr 22 14:40:07.125969 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.125839 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-home\") pod \"c66d796b-2624-4865-9992-c1963e92fdab\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") "
Apr 22 14:40:07.125969 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.125910 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmpfr\" (UniqueName: \"kubernetes.io/projected/c66d796b-2624-4865-9992-c1963e92fdab-kube-api-access-cmpfr\") pod \"c66d796b-2624-4865-9992-c1963e92fdab\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") "
Apr 22 14:40:07.125969 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.125942 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-model-cache\") pod \"c66d796b-2624-4865-9992-c1963e92fdab\" (UID: \"c66d796b-2624-4865-9992-c1963e92fdab\") "
Apr 22 14:40:07.126399 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.126369 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-home" (OuterVolumeSpecName: "home") pod "c66d796b-2624-4865-9992-c1963e92fdab" (UID: "c66d796b-2624-4865-9992-c1963e92fdab"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:40:07.126399 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.126385 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-model-cache" (OuterVolumeSpecName: "model-cache") pod "c66d796b-2624-4865-9992-c1963e92fdab" (UID: "c66d796b-2624-4865-9992-c1963e92fdab"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:40:07.128054 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.128023 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-dshm" (OuterVolumeSpecName: "dshm") pod "c66d796b-2624-4865-9992-c1963e92fdab" (UID: "c66d796b-2624-4865-9992-c1963e92fdab"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:40:07.128159 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.128087 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66d796b-2624-4865-9992-c1963e92fdab-kube-api-access-cmpfr" (OuterVolumeSpecName: "kube-api-access-cmpfr") pod "c66d796b-2624-4865-9992-c1963e92fdab" (UID: "c66d796b-2624-4865-9992-c1963e92fdab"). InnerVolumeSpecName "kube-api-access-cmpfr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:40:07.128425 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.128410 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66d796b-2624-4865-9992-c1963e92fdab-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c66d796b-2624-4865-9992-c1963e92fdab" (UID: "c66d796b-2624-4865-9992-c1963e92fdab"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:40:07.145625 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.145586 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c66d796b-2624-4865-9992-c1963e92fdab" (UID: "c66d796b-2624-4865-9992-c1963e92fdab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:40:07.227521 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.227482 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:40:07.227521 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.227513 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:40:07.227521 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.227525 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:40:07.227786 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.227534 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c66d796b-2624-4865-9992-c1963e92fdab-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:40:07.227786 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.227543 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c66d796b-2624-4865-9992-c1963e92fdab-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:40:07.227786 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.227551 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cmpfr\" (UniqueName: \"kubernetes.io/projected/c66d796b-2624-4865-9992-c1963e92fdab-kube-api-access-cmpfr\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:40:07.667642 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.667612 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc_c66d796b-2624-4865-9992-c1963e92fdab/main/0.log"
Apr 22 14:40:07.668003 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.667976 2562 generic.go:358] "Generic (PLEG): container finished" podID="c66d796b-2624-4865-9992-c1963e92fdab" containerID="354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9" exitCode=137
Apr 22 14:40:07.668090 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.668065 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" event={"ID":"c66d796b-2624-4865-9992-c1963e92fdab","Type":"ContainerDied","Data":"354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9"}
Apr 22 14:40:07.668130 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.668111 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc" event={"ID":"c66d796b-2624-4865-9992-c1963e92fdab","Type":"ContainerDied","Data":"0cad2ca26e5e370587b65be428694ab5fad40853ff612e9a626f841505ad88ab"}
Apr 22 14:40:07.668130 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.668127 2562 scope.go:117] "RemoveContainer" containerID="354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9"
Apr 22 14:40:07.668194 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.668075 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"
Apr 22 14:40:07.689412 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.689390 2562 scope.go:117] "RemoveContainer" containerID="1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a"
Apr 22 14:40:07.693722 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.693691 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"]
Apr 22 14:40:07.698436 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.698402 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8b8dbf45b-cb2tc"]
Apr 22 14:40:07.701616 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.701594 2562 scope.go:117] "RemoveContainer" containerID="354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9"
Apr 22 14:40:07.701907 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:40:07.701889 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9\": container with ID starting with 354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9 not found: ID does not exist" containerID="354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9"
Apr 22 14:40:07.701983 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.701915 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9"} err="failed to get container status \"354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9\": rpc error: code = NotFound desc = could not find container \"354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9\": container with ID starting with 354b3eb3b5030a9706215ad4b8d636014696621cbae1dfa90f91eb6f0edd67f9 not found: ID does not exist"
Apr 22 14:40:07.701983 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.701932 2562 scope.go:117] "RemoveContainer" containerID="1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a"
Apr 22 14:40:07.702148 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:40:07.702132 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a\": container with ID starting with 1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a not found: ID does not exist" containerID="1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a"
Apr 22 14:40:07.702207 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:07.702152 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a"} err="failed to get container status \"1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a\": rpc error: code = NotFound desc = could not find container \"1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a\": container with ID starting with 1d4e9c868109b67006216da76d6be73d2a9dab47753de2d5b3b82bc47f7ab89a not found: ID does not exist"
Apr 22 14:40:08.925621 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:08.925587 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c66d796b-2624-4865-9992-c1963e92fdab" path="/var/lib/kubelet/pods/c66d796b-2624-4865-9992-c1963e92fdab/volumes"
Apr 22 14:40:09.430162 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430127 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 14:40:09.430486 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430475 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="storage-initializer"
Apr 22 14:40:09.430545 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430499 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="storage-initializer"
Apr 22 14:40:09.430545 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430509 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main"
Apr 22 14:40:09.430545 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430514 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main"
Apr 22 14:40:09.430545 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430523 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main"
Apr 22 14:40:09.430545 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430529 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main"
Apr 22 14:40:09.430545 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430539 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="storage-initializer"
Apr 22 14:40:09.430545 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430545 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="storage-initializer"
Apr 22 14:40:09.430788 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430598 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c66d796b-2624-4865-9992-c1963e92fdab" containerName="main"
Apr 22 14:40:09.430788 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.430606 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8a6a753-2348-4c2d-9680-53a5536893e1" containerName="main"
Apr 22 14:40:09.435807 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.435786 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.440607 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.439778 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-zwtrk\""
Apr 22 14:40:09.440607 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.440031 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 22 14:40:09.443814 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.443738 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 14:40:09.550349 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.550310 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.550529 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.550352 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4smp\" (UniqueName: \"kubernetes.io/projected/424839eb-8e4b-460e-96c6-e720b77e1a92-kube-api-access-z4smp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.550529 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.550459 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/424839eb-8e4b-460e-96c6-e720b77e1a92-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.550529 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.550519 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.550736 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.550552 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.550736 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.550640 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.651640 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.651602 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.651837 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.651642 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4smp\" (UniqueName: \"kubernetes.io/projected/424839eb-8e4b-460e-96c6-e720b77e1a92-kube-api-access-z4smp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.651837 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.651730 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/424839eb-8e4b-460e-96c6-e720b77e1a92-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.651837 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.651778 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.651837 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.651821 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.652055 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.651897 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.652055 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.652005 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.652170 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.652068 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:40:09.652170 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.652148 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID:
\"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 14:40:09.653968 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.653937 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 14:40:09.654147 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.654132 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/424839eb-8e4b-460e-96c6-e720b77e1a92-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 14:40:09.660141 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.660121 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4smp\" (UniqueName: \"kubernetes.io/projected/424839eb-8e4b-460e-96c6-e720b77e1a92-kube-api-access-z4smp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 14:40:09.748298 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.748201 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 14:40:09.878405 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.878366 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 14:40:09.879192 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:40:09.879167 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod424839eb_8e4b_460e_96c6_e720b77e1a92.slice/crio-df14f50ac3775f3024fbdd3ae6f425ee355a75d08cda2e1e1bce571eee8e70cd WatchSource:0}: Error finding container df14f50ac3775f3024fbdd3ae6f425ee355a75d08cda2e1e1bce571eee8e70cd: Status 404 returned error can't find the container with id df14f50ac3775f3024fbdd3ae6f425ee355a75d08cda2e1e1bce571eee8e70cd Apr 22 14:40:09.881117 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:09.881097 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:40:10.683038 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:10.682995 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"424839eb-8e4b-460e-96c6-e720b77e1a92","Type":"ContainerStarted","Data":"e919bc7966ab4d3d4d8ddcd502acce8381394c4ab60a0ba1b00c43dba034cee5"} Apr 22 14:40:10.683038 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:10.683033 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"424839eb-8e4b-460e-96c6-e720b77e1a92","Type":"ContainerStarted","Data":"df14f50ac3775f3024fbdd3ae6f425ee355a75d08cda2e1e1bce571eee8e70cd"} Apr 22 14:40:11.595607 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:11.595572 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx"] Apr 22 14:40:11.596025 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:11.595988 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main" containerID="cri-o://c4393ed6d5f21ab2a393b6e25ce197ed07c23af0394ac35c20298cffeffe1837" gracePeriod=30 Apr 22 14:40:11.604875 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:11.604819 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f"] Apr 22 14:40:11.605323 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:11.605296 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main" containerID="cri-o://0bc9b90fb78448d24e4128fc787b0b2f5fb0a3d7af02bcaefadbb0f32b71d1fe" gracePeriod=30 Apr 22 14:40:15.703430 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:15.703393 2562 generic.go:358] "Generic (PLEG): container finished" podID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerID="e919bc7966ab4d3d4d8ddcd502acce8381394c4ab60a0ba1b00c43dba034cee5" exitCode=0 Apr 22 14:40:15.703798 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:15.703467 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"424839eb-8e4b-460e-96c6-e720b77e1a92","Type":"ContainerDied","Data":"e919bc7966ab4d3d4d8ddcd502acce8381394c4ab60a0ba1b00c43dba034cee5"} Apr 22 14:40:16.709067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:16.709028 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"424839eb-8e4b-460e-96c6-e720b77e1a92","Type":"ContainerStarted","Data":"5289f75ae19b12b88f68603687dd88f364d9bcf61e9c2a8df2ad2d2dbd27fce8"} Apr 22 14:40:16.729213 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:16.729160 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.729144135 podStartE2EDuration="7.729144135s" podCreationTimestamp="2026-04-22 14:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:40:16.727468912 +0000 UTC m=+1520.426829594" watchObservedRunningTime="2026-04-22 14:40:16.729144135 +0000 UTC m=+1520.428504816" Apr 22 14:40:19.748825 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:19.748768 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 14:40:19.750273 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:19.750238 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 14:40:25.906010 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:25.905969 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc"] Apr 22 14:40:25.936568 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:25.936534 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc"] Apr 22 14:40:25.936875 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:25.936852 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq"] Apr 22 14:40:25.936978 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:25.936755 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:25.940129 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:25.940101 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-qx6qf\"" Apr 22 14:40:25.940263 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:25.940104 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 14:40:25.958448 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:25.958419 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq"] Apr 22 14:40:25.958575 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:25.958502 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.000757 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.000718 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.000952 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.000772 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.000952 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.000792 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlzk5\" (UniqueName: \"kubernetes.io/projected/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kube-api-access-vlzk5\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.000952 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.000854 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: 
\"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.000952 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.000919 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-home\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.000952 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.000951 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.001186 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.000982 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.001186 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.001077 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-model-cache\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.001186 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.001123 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.001186 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.001155 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-dshm\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.001332 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.001208 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9274be9-14a0-4079-940c-0b45c6502132-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.001332 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.001236 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9bvb\" (UniqueName: \"kubernetes.io/projected/d9274be9-14a0-4079-940c-0b45c6502132-kube-api-access-g9bvb\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.101747 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.101711 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-model-cache\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.101917 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.101763 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.101917 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.101787 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-dshm\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.101917 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.101835 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9274be9-14a0-4079-940c-0b45c6502132-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.101917 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:40:26.101861 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9bvb\" (UniqueName: \"kubernetes.io/projected/d9274be9-14a0-4079-940c-0b45c6502132-kube-api-access-g9bvb\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.102125 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102007 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.102125 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102097 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.102231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102128 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlzk5\" (UniqueName: \"kubernetes.io/projected/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kube-api-access-vlzk5\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.102231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102160 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.102231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102174 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-model-cache\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.102231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102199 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-home\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.102231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102226 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.102478 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102262 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-home\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.102555 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102483 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.102613 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102591 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.102797 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102771 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-home\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.102797 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102785 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.102960 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.102870 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.104381 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.104352 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.104611 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.104588 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-dshm\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.104872 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.104854 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9274be9-14a0-4079-940c-0b45c6502132-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.104921 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.104892 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.110315 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.110291 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9bvb\" (UniqueName: \"kubernetes.io/projected/d9274be9-14a0-4079-940c-0b45c6502132-kube-api-access-g9bvb\") pod \"custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.110384 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.110344 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlzk5\" (UniqueName: \"kubernetes.io/projected/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kube-api-access-vlzk5\") pod \"custom-route-timeout-pd-test-kserve-c58f8844b-v8blc\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.249383 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.249291 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:26.269281 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.269243 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:26.409106 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.409066 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc"] Apr 22 14:40:26.410945 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:40:26.410917 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c3a348_63cb_46e0_bbb1_dd9296f1d5df.slice/crio-b8057d851d29951217ba7c863209a9b38f7f019dcf2d9910aae811b6f2c89f13 WatchSource:0}: Error finding container b8057d851d29951217ba7c863209a9b38f7f019dcf2d9910aae811b6f2c89f13: Status 404 returned error can't find the container with id b8057d851d29951217ba7c863209a9b38f7f019dcf2d9910aae811b6f2c89f13 Apr 22 14:40:26.432936 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.432910 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq"] Apr 22 14:40:26.434291 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:40:26.434262 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9274be9_14a0_4079_940c_0b45c6502132.slice/crio-1a94f708dce4207ec7e9f8ff8d07119d1ebe876617afd6a2717dc88c2c4ca925 WatchSource:0}: Error finding container 1a94f708dce4207ec7e9f8ff8d07119d1ebe876617afd6a2717dc88c2c4ca925: Status 404 returned error can't find the container with id 1a94f708dce4207ec7e9f8ff8d07119d1ebe876617afd6a2717dc88c2c4ca925 Apr 22 14:40:26.749587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.749548 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" 
event={"ID":"d9274be9-14a0-4079-940c-0b45c6502132","Type":"ContainerStarted","Data":"e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30"} Apr 22 14:40:26.749778 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.749593 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" event={"ID":"d9274be9-14a0-4079-940c-0b45c6502132","Type":"ContainerStarted","Data":"1a94f708dce4207ec7e9f8ff8d07119d1ebe876617afd6a2717dc88c2c4ca925"} Apr 22 14:40:26.751161 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.751132 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" event={"ID":"30c3a348-63cb-46e0-bbb1-dd9296f1d5df","Type":"ContainerStarted","Data":"2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b"} Apr 22 14:40:26.751275 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.751167 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" event={"ID":"30c3a348-63cb-46e0-bbb1-dd9296f1d5df","Type":"ContainerStarted","Data":"b8057d851d29951217ba7c863209a9b38f7f019dcf2d9910aae811b6f2c89f13"} Apr 22 14:40:26.751275 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:26.751234 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:27.763980 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:27.763700 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" event={"ID":"30c3a348-63cb-46e0-bbb1-dd9296f1d5df","Type":"ContainerStarted","Data":"62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a"} Apr 22 14:40:29.749894 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:29.749826 2562 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 14:40:31.786188 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:31.786110 2562 generic.go:358] "Generic (PLEG): container finished" podID="d9274be9-14a0-4079-940c-0b45c6502132" containerID="e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30" exitCode=0 Apr 22 14:40:31.786635 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:31.786182 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" event={"ID":"d9274be9-14a0-4079-940c-0b45c6502132","Type":"ContainerDied","Data":"e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30"} Apr 22 14:40:32.793388 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:32.793348 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" event={"ID":"d9274be9-14a0-4079-940c-0b45c6502132","Type":"ContainerStarted","Data":"5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60"} Apr 22 14:40:32.795184 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:32.795151 2562 generic.go:358] "Generic (PLEG): container finished" podID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerID="62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a" exitCode=0 Apr 22 14:40:32.795306 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:32.795208 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" event={"ID":"30c3a348-63cb-46e0-bbb1-dd9296f1d5df","Type":"ContainerDied","Data":"62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a"} Apr 22 14:40:32.816379 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:40:32.816321 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podStartSLOduration=7.816301658 podStartE2EDuration="7.816301658s" podCreationTimestamp="2026-04-22 14:40:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:40:32.813117446 +0000 UTC m=+1536.512478166" watchObservedRunningTime="2026-04-22 14:40:32.816301658 +0000 UTC m=+1536.515662341" Apr 22 14:40:33.803009 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:33.802960 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" event={"ID":"30c3a348-63cb-46e0-bbb1-dd9296f1d5df","Type":"ContainerStarted","Data":"cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8"} Apr 22 14:40:33.825287 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:33.825221 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podStartSLOduration=8.825202348 podStartE2EDuration="8.825202348s" podCreationTimestamp="2026-04-22 14:40:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:40:33.823954099 +0000 UTC m=+1537.523314818" watchObservedRunningTime="2026-04-22 14:40:33.825202348 +0000 UTC m=+1537.524563024" Apr 22 14:40:36.249872 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:36.249830 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:36.269600 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:36.249890 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:36.269600 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:36.251253 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 22 14:40:36.269600 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:36.268203 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:40:36.270265 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:36.270236 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:36.270390 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:36.270283 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:40:36.271762 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:36.271733 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 14:40:39.748893 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:39.748848 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 14:40:39.749406 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:39.749209 2562 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 14:40:41.606288 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.606240 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="llm-d-routing-sidecar" containerID="cri-o://2178ad3b78fbe363127dc4af327cf12b568675e38fcff857106729f4767d37b7" gracePeriod=2 Apr 22 14:40:41.847235 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.847143 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f_8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b/main/0.log" Apr 22 14:40:41.848636 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.848387 2562 generic.go:358] "Generic (PLEG): container finished" podID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerID="0bc9b90fb78448d24e4128fc787b0b2f5fb0a3d7af02bcaefadbb0f32b71d1fe" exitCode=137 Apr 22 14:40:41.848636 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.848419 2562 generic.go:358] "Generic (PLEG): container finished" podID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerID="2178ad3b78fbe363127dc4af327cf12b568675e38fcff857106729f4767d37b7" exitCode=0 Apr 22 14:40:41.848636 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.848556 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" event={"ID":"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b","Type":"ContainerDied","Data":"0bc9b90fb78448d24e4128fc787b0b2f5fb0a3d7af02bcaefadbb0f32b71d1fe"} Apr 22 14:40:41.848636 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.848587 2562 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" event={"ID":"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b","Type":"ContainerDied","Data":"2178ad3b78fbe363127dc4af327cf12b568675e38fcff857106729f4767d37b7"} Apr 22 14:40:41.851307 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.851166 2562 generic.go:358] "Generic (PLEG): container finished" podID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerID="c4393ed6d5f21ab2a393b6e25ce197ed07c23af0394ac35c20298cffeffe1837" exitCode=137 Apr 22 14:40:41.851307 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.851285 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" event={"ID":"1208001a-27e8-45e7-8213-f205b5eb60ec","Type":"ContainerDied","Data":"c4393ed6d5f21ab2a393b6e25ce197ed07c23af0394ac35c20298cffeffe1837"} Apr 22 14:40:41.925449 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.925416 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f_8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b/main/0.log" Apr 22 14:40:41.926231 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.926206 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:40:41.953904 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:41.953866 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:40:42.082351 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082312 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-model-cache\") pod \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " Apr 22 14:40:42.082351 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082369 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-dshm\") pod \"1208001a-27e8-45e7-8213-f205b5eb60ec\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " Apr 22 14:40:42.082637 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082421 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-tls-certs\") pod \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " Apr 22 14:40:42.082637 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082446 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5xjf\" (UniqueName: \"kubernetes.io/projected/1208001a-27e8-45e7-8213-f205b5eb60ec-kube-api-access-g5xjf\") pod \"1208001a-27e8-45e7-8213-f205b5eb60ec\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " Apr 22 14:40:42.082637 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082471 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1208001a-27e8-45e7-8213-f205b5eb60ec-tls-certs\") pod \"1208001a-27e8-45e7-8213-f205b5eb60ec\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " Apr 22 14:40:42.082637 ip-10-0-133-31 kubenswrapper[2562]: 
I0422 14:40:42.082515 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-home\") pod \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " Apr 22 14:40:42.082637 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082544 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-home\") pod \"1208001a-27e8-45e7-8213-f205b5eb60ec\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " Apr 22 14:40:42.082637 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082591 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz788\" (UniqueName: \"kubernetes.io/projected/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kube-api-access-tz788\") pod \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " Apr 22 14:40:42.082637 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082636 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kserve-provision-location\") pod \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " Apr 22 14:40:42.083156 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082717 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-dshm\") pod \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\" (UID: \"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b\") " Apr 22 14:40:42.083156 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082754 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-kserve-provision-location\") pod \"1208001a-27e8-45e7-8213-f205b5eb60ec\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " Apr 22 14:40:42.083156 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.082784 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-model-cache\") pod \"1208001a-27e8-45e7-8213-f205b5eb60ec\" (UID: \"1208001a-27e8-45e7-8213-f205b5eb60ec\") " Apr 22 14:40:42.083156 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.083137 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-home" (OuterVolumeSpecName: "home") pod "1208001a-27e8-45e7-8213-f205b5eb60ec" (UID: "1208001a-27e8-45e7-8213-f205b5eb60ec"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:42.083401 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.083255 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-model-cache" (OuterVolumeSpecName: "model-cache") pod "1208001a-27e8-45e7-8213-f205b5eb60ec" (UID: "1208001a-27e8-45e7-8213-f205b5eb60ec"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:42.083696 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.083636 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-model-cache" (OuterVolumeSpecName: "model-cache") pod "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" (UID: "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:42.086219 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.086043 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1208001a-27e8-45e7-8213-f205b5eb60ec-kube-api-access-g5xjf" (OuterVolumeSpecName: "kube-api-access-g5xjf") pod "1208001a-27e8-45e7-8213-f205b5eb60ec" (UID: "1208001a-27e8-45e7-8213-f205b5eb60ec"). InnerVolumeSpecName "kube-api-access-g5xjf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:40:42.087575 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.087543 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-dshm" (OuterVolumeSpecName: "dshm") pod "1208001a-27e8-45e7-8213-f205b5eb60ec" (UID: "1208001a-27e8-45e7-8213-f205b5eb60ec"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:42.087966 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.087941 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-home" (OuterVolumeSpecName: "home") pod "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" (UID: "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:42.088126 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.088097 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-dshm" (OuterVolumeSpecName: "dshm") pod "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" (UID: "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:42.090153 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.090113 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1208001a-27e8-45e7-8213-f205b5eb60ec-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1208001a-27e8-45e7-8213-f205b5eb60ec" (UID: "1208001a-27e8-45e7-8213-f205b5eb60ec"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:40:42.091115 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.091076 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" (UID: "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:40:42.093708 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.093006 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kube-api-access-tz788" (OuterVolumeSpecName: "kube-api-access-tz788") pod "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" (UID: "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b"). InnerVolumeSpecName "kube-api-access-tz788". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:40:42.173204 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.173102 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" (UID: "8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:42.184283 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184241 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184283 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184278 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tz788\" (UniqueName: \"kubernetes.io/projected/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kube-api-access-tz788\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184297 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184311 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184325 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184337 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184348 2562 
reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184360 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184373 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5xjf\" (UniqueName: \"kubernetes.io/projected/1208001a-27e8-45e7-8213-f205b5eb60ec-kube-api-access-g5xjf\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184386 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1208001a-27e8-45e7-8213-f205b5eb60ec-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.184522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.184399 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.187861 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.187819 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1208001a-27e8-45e7-8213-f205b5eb60ec" (UID: "1208001a-27e8-45e7-8213-f205b5eb60ec"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:42.284915 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.284880 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1208001a-27e8-45e7-8213-f205b5eb60ec-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:40:42.858029 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.857986 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" event={"ID":"1208001a-27e8-45e7-8213-f205b5eb60ec","Type":"ContainerDied","Data":"007b24f4d20339611a8feb47c6ea57bccfc1d19b069b46f71ec39c891d3bfa6d"} Apr 22 14:40:42.858513 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.858043 2562 scope.go:117] "RemoveContainer" containerID="c4393ed6d5f21ab2a393b6e25ce197ed07c23af0394ac35c20298cffeffe1837" Apr 22 14:40:42.858513 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.858001 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx" Apr 22 14:40:42.859825 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.859803 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f_8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b/main/0.log" Apr 22 14:40:42.863744 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.863700 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" event={"ID":"8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b","Type":"ContainerDied","Data":"d231c6887a533120dd71657cd352964bc51fc16ec41457d8ce1e2977e5ce210b"} Apr 22 14:40:42.863864 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.863768 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f" Apr 22 14:40:42.887302 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.887189 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx"] Apr 22 14:40:42.889202 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.888782 2562 scope.go:117] "RemoveContainer" containerID="169751c881de610d85657e5aa14a0c9fe277c85887cc12dde98acc915739e9e6" Apr 22 14:40:42.890435 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.890414 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-59kh5wx"] Apr 22 14:40:42.903454 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.903425 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f"] Apr 22 14:40:42.909664 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.909623 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-794cdc4487jx24f"] Apr 22 14:40:42.928176 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.928134 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" path="/var/lib/kubelet/pods/1208001a-27e8-45e7-8213-f205b5eb60ec/volumes" Apr 22 14:40:42.928769 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.928747 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" path="/var/lib/kubelet/pods/8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b/volumes" Apr 22 14:40:42.958973 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:42.958948 2562 scope.go:117] "RemoveContainer" containerID="0bc9b90fb78448d24e4128fc787b0b2f5fb0a3d7af02bcaefadbb0f32b71d1fe" Apr 22 14:40:42.992267 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:40:42.992240 2562 scope.go:117] "RemoveContainer" containerID="419bde872344537e5b20df4e4092d9472184b84c18cb1b654eb904d12681f641" Apr 22 14:40:43.041837 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:43.041809 2562 scope.go:117] "RemoveContainer" containerID="2178ad3b78fbe363127dc4af327cf12b568675e38fcff857106729f4767d37b7" Apr 22 14:40:46.250248 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:46.250200 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 22 14:40:46.269581 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:46.269542 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused" Apr 22 14:40:49.748768 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:49.748715 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused" Apr 22 14:40:56.251055 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:56.250773 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused" Apr 22 
Apr 22 14:40:56.270432 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:56.270385 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:40:59.749137 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:40:59.749085 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:41:06.249974 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:06.249922 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:41:06.270619 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:06.270576 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:41:09.749550 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:09.749501 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:41:16.249901 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:16.249847 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:41:16.270135 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:16.270087 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:41:19.749413 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:19.749350 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:41:26.250053 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:26.250002 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:41:26.269772 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:26.269730 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:41:29.748969 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:29.748922 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:41:36.250276 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:36.250206 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:41:36.270530 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:36.270471 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:41:39.748979 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:39.748935 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:41:46.250148 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:46.250051 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:41:46.270642 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:46.270596 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:41:49.749308 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:49.749257 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:41:56.250559 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:56.250509 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:41:56.270602 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:56.270560 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:41:59.748957 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:41:59.748902 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:42:06.250279 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:06.250230 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:42:06.269909 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:06.269873 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:42:09.748879 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:09.748835 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:42:16.250467 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:16.250400 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:42:16.269622 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:16.269578 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:42:19.749551 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:19.749503 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:42:26.249980 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:26.249924 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:42:26.269968 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:26.269914 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:42:29.749050 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:29.748998 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:42:36.250467 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:36.250416 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:42:36.269622 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:36.269574 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:42:39.748724 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:39.748682 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:42:46.250024 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:46.249976 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:42:46.269676 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:46.269618 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:42:49.749087 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:49.749037 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:42:56.249704 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:56.249640 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:42:56.269599 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:56.269548 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:42:59.748765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:42:59.748708 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:43:06.249899 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:06.249848 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:43:06.269735 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:06.269693 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" probeResult="failure" output="Get \"https://10.133.0.54:8000/health\": dial tcp 10.133.0.54:8000: connect: connection refused"
Apr 22 14:43:09.748857 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:09.748807 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" probeResult="failure" output="Get \"https://10.133.0.52:8000/health\": dial tcp 10.133.0.52:8000: connect: connection refused"
Apr 22 14:43:16.250247 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:16.250139 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" probeResult="failure" output="Get \"https://10.133.0.53:8001/health\": dial tcp 10.133.0.53:8001: connect: connection refused"
Apr 22 14:43:16.285103 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:16.285072 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq"
Apr 22 14:43:16.519693 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:16.519588 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq"
Apr 22 14:43:19.758266 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:19.758235 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:43:19.766137 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:19.766109 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:43:26.259150 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:26.259115 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc"
Apr 22 14:43:26.272070 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:26.272041 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc"
Apr 22 14:43:27.545306 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:27.545267 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 14:43:27.545798 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:27.545629 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main" containerID="cri-o://5289f75ae19b12b88f68603687dd88f364d9bcf61e9c2a8df2ad2d2dbd27fce8" gracePeriod=30
Apr 22 14:43:28.555355 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.555318 2562 generic.go:358] "Generic (PLEG): container finished" podID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerID="5289f75ae19b12b88f68603687dd88f364d9bcf61e9c2a8df2ad2d2dbd27fce8" exitCode=0
Apr 22 14:43:28.555769 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.555375 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"424839eb-8e4b-460e-96c6-e720b77e1a92","Type":"ContainerDied","Data":"5289f75ae19b12b88f68603687dd88f364d9bcf61e9c2a8df2ad2d2dbd27fce8"}
Apr 22 14:43:28.599561 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.599535 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:43:28.773522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.773418 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-kserve-provision-location\") pod \"424839eb-8e4b-460e-96c6-e720b77e1a92\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") "
Apr 22 14:43:28.773522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.773472 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-dshm\") pod \"424839eb-8e4b-460e-96c6-e720b77e1a92\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") "
Apr 22 14:43:28.773522 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.773518 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4smp\" (UniqueName: \"kubernetes.io/projected/424839eb-8e4b-460e-96c6-e720b77e1a92-kube-api-access-z4smp\") pod \"424839eb-8e4b-460e-96c6-e720b77e1a92\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") "
Apr 22 14:43:28.773855 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.773558 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/424839eb-8e4b-460e-96c6-e720b77e1a92-tls-certs\") pod \"424839eb-8e4b-460e-96c6-e720b77e1a92\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") "
Apr 22 14:43:28.773855 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.773708 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-model-cache\") pod \"424839eb-8e4b-460e-96c6-e720b77e1a92\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") "
Apr 22 14:43:28.773855 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.773759 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-home\") pod \"424839eb-8e4b-460e-96c6-e720b77e1a92\" (UID: \"424839eb-8e4b-460e-96c6-e720b77e1a92\") "
Apr 22 14:43:28.774185 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.774016 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-model-cache" (OuterVolumeSpecName: "model-cache") pod "424839eb-8e4b-460e-96c6-e720b77e1a92" (UID: "424839eb-8e4b-460e-96c6-e720b77e1a92"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:28.774185 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.774155 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:43:28.774413 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.774255 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-home" (OuterVolumeSpecName: "home") pod "424839eb-8e4b-460e-96c6-e720b77e1a92" (UID: "424839eb-8e4b-460e-96c6-e720b77e1a92"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:28.775761 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.775727 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424839eb-8e4b-460e-96c6-e720b77e1a92-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "424839eb-8e4b-460e-96c6-e720b77e1a92" (UID: "424839eb-8e4b-460e-96c6-e720b77e1a92"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:43:28.775880 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.775813 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424839eb-8e4b-460e-96c6-e720b77e1a92-kube-api-access-z4smp" (OuterVolumeSpecName: "kube-api-access-z4smp") pod "424839eb-8e4b-460e-96c6-e720b77e1a92" (UID: "424839eb-8e4b-460e-96c6-e720b77e1a92"). InnerVolumeSpecName "kube-api-access-z4smp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:43:28.776244 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.776213 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-dshm" (OuterVolumeSpecName: "dshm") pod "424839eb-8e4b-460e-96c6-e720b77e1a92" (UID: "424839eb-8e4b-460e-96c6-e720b77e1a92"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:28.820516 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.820464 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "424839eb-8e4b-460e-96c6-e720b77e1a92" (UID: "424839eb-8e4b-460e-96c6-e720b77e1a92"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:28.875245 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.875209 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:43:28.875245 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.875239 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4smp\" (UniqueName: \"kubernetes.io/projected/424839eb-8e4b-460e-96c6-e720b77e1a92-kube-api-access-z4smp\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:43:28.875245 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.875253 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/424839eb-8e4b-460e-96c6-e720b77e1a92-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:43:28.875483 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.875262 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:43:28.875483 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:28.875271 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424839eb-8e4b-460e-96c6-e720b77e1a92-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:43:29.561516 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:29.561479 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"424839eb-8e4b-460e-96c6-e720b77e1a92","Type":"ContainerDied","Data":"df14f50ac3775f3024fbdd3ae6f425ee355a75d08cda2e1e1bce571eee8e70cd"}
Apr 22 14:43:29.561952 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:29.561528 2562 scope.go:117] "RemoveContainer" containerID="5289f75ae19b12b88f68603687dd88f364d9bcf61e9c2a8df2ad2d2dbd27fce8"
Apr 22 14:43:29.561952 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:29.561559 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 14:43:29.581040 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:29.581005 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 14:43:29.583033 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:29.582876 2562 scope.go:117] "RemoveContainer" containerID="e919bc7966ab4d3d4d8ddcd502acce8381394c4ab60a0ba1b00c43dba034cee5"
Apr 22 14:43:29.586540 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:29.586517 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 14:43:30.925906 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:30.925872 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" path="/var/lib/kubelet/pods/424839eb-8e4b-460e-96c6-e720b77e1a92/volumes"
Apr 22 14:43:34.566944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.566911 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"]
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567249 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="storage-initializer"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567260 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="storage-initializer"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567271 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="storage-initializer"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567277 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="storage-initializer"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567284 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567290 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567298 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="storage-initializer"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567304 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="storage-initializer"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567314 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567319 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567328 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567334 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567341 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="llm-d-routing-sidecar"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567346 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="llm-d-routing-sidecar"
Apr 22 14:43:34.567394 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567398 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1208001a-27e8-45e7-8213-f205b5eb60ec" containerName="main"
Apr 22 14:43:34.568210 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567406 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="llm-d-routing-sidecar"
Apr 22 14:43:34.568210 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567412 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8fb4a4c2-6e6e-424c-b8b9-f1eb0491555b" containerName="main"
Apr 22 14:43:34.568210 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.567420 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="424839eb-8e4b-460e-96c6-e720b77e1a92" containerName="main"
Apr 22 14:43:34.573359 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.573334 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.576014 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.575986 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-gs9jp\""
Apr 22 14:43:34.586390 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.586363 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"]
Apr 22 14:43:34.634302 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.634265 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/35510640-4170-4d27-813b-866bd7f36d55-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.634479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.634306 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/35510640-4170-4d27-813b-866bd7f36d55-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.634479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.634357 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.634479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.634412 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.634479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.634435 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.634479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.634452 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.634823 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.634491 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.634823 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.634567 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/35510640-4170-4d27-813b-866bd7f36d55-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.634823 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.634605 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbnmj\" (UniqueName: \"kubernetes.io/projected/35510640-4170-4d27-813b-866bd7f36d55-kube-api-access-cbnmj\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.735301 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.735235 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/35510640-4170-4d27-813b-866bd7f36d55-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.735498 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.735303 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/35510640-4170-4d27-813b-866bd7f36d55-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.735498 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.735353 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.735498 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.735417 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.735498 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.735443 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.735498 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.735468 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"
Apr 22 14:43:34.735825 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.735509 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\"
(UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.735825 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.735552 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/35510640-4170-4d27-813b-866bd7f36d55-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.735825 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.735589 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbnmj\" (UniqueName: \"kubernetes.io/projected/35510640-4170-4d27-813b-866bd7f36d55-kube-api-access-cbnmj\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.736414 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.736123 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/35510640-4170-4d27-813b-866bd7f36d55-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.736414 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.736230 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.736414 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.736308 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.736414 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.736378 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.736722 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.736464 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.738871 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.738757 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/35510640-4170-4d27-813b-866bd7f36d55-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.738871 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.738828 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/35510640-4170-4d27-813b-866bd7f36d55-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.744635 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.744587 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbnmj\" (UniqueName: \"kubernetes.io/projected/35510640-4170-4d27-813b-866bd7f36d55-kube-api-access-cbnmj\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.744766 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.744688 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/35510640-4170-4d27-813b-866bd7f36d55-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-jzrxz\" (UID: \"35510640-4170-4d27-813b-866bd7f36d55\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:34.888610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:34.888567 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:35.041703 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:35.041672 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz"] Apr 22 14:43:35.043511 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:43:35.043476 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35510640_4170_4d27_813b_866bd7f36d55.slice/crio-267981a0e08f766be02e14ed32c181fb386c845527ffccc6240667ca29c8c503 WatchSource:0}: Error finding container 267981a0e08f766be02e14ed32c181fb386c845527ffccc6240667ca29c8c503: Status 404 returned error can't find the container with id 267981a0e08f766be02e14ed32c181fb386c845527ffccc6240667ca29c8c503 Apr 22 14:43:35.045930 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:35.045887 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:43:35.046046 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:35.045974 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:43:35.046046 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:35.046016 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 14:43:35.591107 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:35.591063 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" 
event={"ID":"35510640-4170-4d27-813b-866bd7f36d55","Type":"ContainerStarted","Data":"c95a792e5d96e8172aa51d5274a6caab824f9b1862cba307ecc9ad4e81652667"} Apr 22 14:43:35.591107 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:35.591110 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" event={"ID":"35510640-4170-4d27-813b-866bd7f36d55","Type":"ContainerStarted","Data":"267981a0e08f766be02e14ed32c181fb386c845527ffccc6240667ca29c8c503"} Apr 22 14:43:35.611845 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:35.611793 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" podStartSLOduration=1.6117779639999998 podStartE2EDuration="1.611777964s" podCreationTimestamp="2026-04-22 14:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:43:35.610881 +0000 UTC m=+1719.310241682" watchObservedRunningTime="2026-04-22 14:43:35.611777964 +0000 UTC m=+1719.311138647" Apr 22 14:43:35.889424 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:35.889325 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:36.893910 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:36.893878 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:37.598952 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:37.598913 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:37.600005 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:37.599979 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-jzrxz" Apr 22 14:43:40.560462 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.560424 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls"] Apr 22 14:43:40.565096 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.565070 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.567932 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.567903 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 22 14:43:40.568063 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.567907 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-dctb6\"" Apr 22 14:43:40.575168 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.575143 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls"] Apr 22 14:43:40.580108 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.580080 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.580193 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.580133 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.580243 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.580196 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkj75\" (UniqueName: \"kubernetes.io/projected/ab098998-0b47-4cde-8c41-4ec4474e82c8-kube-api-access-kkj75\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.580280 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.580251 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab098998-0b47-4cde-8c41-4ec4474e82c8-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.580280 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.580276 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.580344 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.580332 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681120 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681084 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkj75\" (UniqueName: \"kubernetes.io/projected/ab098998-0b47-4cde-8c41-4ec4474e82c8-kube-api-access-kkj75\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681298 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681139 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab098998-0b47-4cde-8c41-4ec4474e82c8-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681298 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681172 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681298 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681226 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681298 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681250 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681456 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681304 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681725 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681639 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681725 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681709 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681837 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681725 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.681837 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.681741 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.683762 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.683745 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab098998-0b47-4cde-8c41-4ec4474e82c8-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.689544 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.689518 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkj75\" (UniqueName: 
\"kubernetes.io/projected/ab098998-0b47-4cde-8c41-4ec4474e82c8-kube-api-access-kkj75\") pod \"scheduler-inline-config-test-kserve-router-scheduler-8574822pls\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:40.875210 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:40.875168 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:41.016620 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:43:41.016582 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab098998_0b47_4cde_8c41_4ec4474e82c8.slice/crio-f8839401b69d57e32bf71befb39b122d5f018c04dcbb86010ee83a1c219128ee WatchSource:0}: Error finding container f8839401b69d57e32bf71befb39b122d5f018c04dcbb86010ee83a1c219128ee: Status 404 returned error can't find the container with id f8839401b69d57e32bf71befb39b122d5f018c04dcbb86010ee83a1c219128ee Apr 22 14:43:41.016740 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:41.016724 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls"] Apr 22 14:43:41.617166 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:41.617127 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" event={"ID":"ab098998-0b47-4cde-8c41-4ec4474e82c8","Type":"ContainerStarted","Data":"e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291"} Apr 22 14:43:41.617544 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:41.617172 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" 
event={"ID":"ab098998-0b47-4cde-8c41-4ec4474e82c8","Type":"ContainerStarted","Data":"f8839401b69d57e32bf71befb39b122d5f018c04dcbb86010ee83a1c219128ee"} Apr 22 14:43:42.622719 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:42.622678 2562 generic.go:358] "Generic (PLEG): container finished" podID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerID="e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291" exitCode=0 Apr 22 14:43:42.623189 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:42.622770 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" event={"ID":"ab098998-0b47-4cde-8c41-4ec4474e82c8","Type":"ContainerDied","Data":"e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291"} Apr 22 14:43:43.628399 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:43.628365 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" event={"ID":"ab098998-0b47-4cde-8c41-4ec4474e82c8","Type":"ContainerStarted","Data":"fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56"} Apr 22 14:43:43.628399 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:43.628405 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" event={"ID":"ab098998-0b47-4cde-8c41-4ec4474e82c8","Type":"ContainerStarted","Data":"299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3"} Apr 22 14:43:43.628829 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:43.628499 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:43.651507 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:43.651457 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" podStartSLOduration=3.65144185 podStartE2EDuration="3.65144185s" podCreationTimestamp="2026-04-22 14:43:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:43:43.64985316 +0000 UTC m=+1727.349213842" watchObservedRunningTime="2026-04-22 14:43:43.65144185 +0000 UTC m=+1727.350802532" Apr 22 14:43:44.303477 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:44.303437 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq"] Apr 22 14:43:44.304215 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:44.303852 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" containerID="cri-o://5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60" gracePeriod=30 Apr 22 14:43:44.312377 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:44.312345 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc"] Apr 22 14:43:44.312799 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:44.312763 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" containerID="cri-o://cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8" gracePeriod=30 Apr 22 14:43:50.875403 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:50.875361 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:50.875403 
ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:50.875411 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:50.878256 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:50.878231 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:43:51.663970 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:43:51.663938 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:44:12.667534 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:12.667502 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:44:14.313645 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.313601 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="llm-d-routing-sidecar" containerID="cri-o://2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b" gracePeriod=2 Apr 22 14:44:14.611354 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.611329 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-c58f8844b-v8blc_30c3a348-63cb-46e0-bbb1-dd9296f1d5df/main/0.log" Apr 22 14:44:14.612017 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.611999 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:44:14.629948 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.629918 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:44:14.697677 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.697622 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-home\") pod \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " Apr 22 14:44:14.698122 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.697705 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kserve-provision-location\") pod \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " Apr 22 14:44:14.698122 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.697749 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-tls-certs\") pod \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " Apr 22 14:44:14.698122 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.697804 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlzk5\" (UniqueName: \"kubernetes.io/projected/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kube-api-access-vlzk5\") pod \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " Apr 22 14:44:14.698122 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.697846 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-model-cache\") pod \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " Apr 22 14:44:14.698122 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.697903 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-dshm\") pod \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\" (UID: \"30c3a348-63cb-46e0-bbb1-dd9296f1d5df\") " Apr 22 14:44:14.698122 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.698012 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-home" (OuterVolumeSpecName: "home") pod "30c3a348-63cb-46e0-bbb1-dd9296f1d5df" (UID: "30c3a348-63cb-46e0-bbb1-dd9296f1d5df"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:14.698477 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.698166 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-model-cache" (OuterVolumeSpecName: "model-cache") pod "30c3a348-63cb-46e0-bbb1-dd9296f1d5df" (UID: "30c3a348-63cb-46e0-bbb1-dd9296f1d5df"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:14.698477 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.698333 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.698477 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.698355 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.700005 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.699982 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kube-api-access-vlzk5" (OuterVolumeSpecName: "kube-api-access-vlzk5") pod "30c3a348-63cb-46e0-bbb1-dd9296f1d5df" (UID: "30c3a348-63cb-46e0-bbb1-dd9296f1d5df"). InnerVolumeSpecName "kube-api-access-vlzk5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:44:14.700005 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.699986 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-dshm" (OuterVolumeSpecName: "dshm") pod "30c3a348-63cb-46e0-bbb1-dd9296f1d5df" (UID: "30c3a348-63cb-46e0-bbb1-dd9296f1d5df"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:14.700479 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.700464 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "30c3a348-63cb-46e0-bbb1-dd9296f1d5df" (UID: "30c3a348-63cb-46e0-bbb1-dd9296f1d5df"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:44:14.751665 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.751616 2562 generic.go:358] "Generic (PLEG): container finished" podID="d9274be9-14a0-4079-940c-0b45c6502132" containerID="5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60" exitCode=137 Apr 22 14:44:14.751846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.751710 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" Apr 22 14:44:14.751846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.751709 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" event={"ID":"d9274be9-14a0-4079-940c-0b45c6502132","Type":"ContainerDied","Data":"5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60"} Apr 22 14:44:14.751846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.751756 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq" event={"ID":"d9274be9-14a0-4079-940c-0b45c6502132","Type":"ContainerDied","Data":"1a94f708dce4207ec7e9f8ff8d07119d1ebe876617afd6a2717dc88c2c4ca925"} Apr 22 14:44:14.751846 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.751776 2562 scope.go:117] "RemoveContainer" containerID="5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60" Apr 22 14:44:14.753011 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.752995 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-c58f8844b-v8blc_30c3a348-63cb-46e0-bbb1-dd9296f1d5df/main/0.log" Apr 22 14:44:14.753674 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.753637 2562 generic.go:358] "Generic (PLEG): container finished" podID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" 
containerID="cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8" exitCode=137 Apr 22 14:44:14.753780 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.753678 2562 generic.go:358] "Generic (PLEG): container finished" podID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerID="2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b" exitCode=0 Apr 22 14:44:14.753780 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.753705 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" event={"ID":"30c3a348-63cb-46e0-bbb1-dd9296f1d5df","Type":"ContainerDied","Data":"cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8"} Apr 22 14:44:14.753780 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.753745 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" Apr 22 14:44:14.753780 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.753750 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" event={"ID":"30c3a348-63cb-46e0-bbb1-dd9296f1d5df","Type":"ContainerDied","Data":"2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b"} Apr 22 14:44:14.753780 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.753767 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc" event={"ID":"30c3a348-63cb-46e0-bbb1-dd9296f1d5df","Type":"ContainerDied","Data":"b8057d851d29951217ba7c863209a9b38f7f019dcf2d9910aae811b6f2c89f13"} Apr 22 14:44:14.764773 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.764740 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"30c3a348-63cb-46e0-bbb1-dd9296f1d5df" (UID: "30c3a348-63cb-46e0-bbb1-dd9296f1d5df"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:14.774740 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.774715 2562 scope.go:117] "RemoveContainer" containerID="e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30" Apr 22 14:44:14.798664 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.798626 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-model-cache\") pod \"d9274be9-14a0-4079-940c-0b45c6502132\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " Apr 22 14:44:14.798800 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.798735 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9bvb\" (UniqueName: \"kubernetes.io/projected/d9274be9-14a0-4079-940c-0b45c6502132-kube-api-access-g9bvb\") pod \"d9274be9-14a0-4079-940c-0b45c6502132\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " Apr 22 14:44:14.798800 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.798777 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9274be9-14a0-4079-940c-0b45c6502132-tls-certs\") pod \"d9274be9-14a0-4079-940c-0b45c6502132\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " Apr 22 14:44:14.798920 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.798872 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-model-cache" (OuterVolumeSpecName: "model-cache") pod "d9274be9-14a0-4079-940c-0b45c6502132" (UID: "d9274be9-14a0-4079-940c-0b45c6502132"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:14.798976 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.798934 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-kserve-provision-location\") pod \"d9274be9-14a0-4079-940c-0b45c6502132\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " Apr 22 14:44:14.799034 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.798973 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-dshm\") pod \"d9274be9-14a0-4079-940c-0b45c6502132\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " Apr 22 14:44:14.799034 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.799000 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-home\") pod \"d9274be9-14a0-4079-940c-0b45c6502132\" (UID: \"d9274be9-14a0-4079-940c-0b45c6502132\") " Apr 22 14:44:14.799362 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.799343 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.799503 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.799489 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-model-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.799603 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.799584 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.799748 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.799606 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vlzk5\" (UniqueName: \"kubernetes.io/projected/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-kube-api-access-vlzk5\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.799748 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.799620 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30c3a348-63cb-46e0-bbb1-dd9296f1d5df-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.799748 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.799684 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-home" (OuterVolumeSpecName: "home") pod "d9274be9-14a0-4079-940c-0b45c6502132" (UID: "d9274be9-14a0-4079-940c-0b45c6502132"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:14.800876 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.800849 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9274be9-14a0-4079-940c-0b45c6502132-kube-api-access-g9bvb" (OuterVolumeSpecName: "kube-api-access-g9bvb") pod "d9274be9-14a0-4079-940c-0b45c6502132" (UID: "d9274be9-14a0-4079-940c-0b45c6502132"). InnerVolumeSpecName "kube-api-access-g9bvb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:44:14.800961 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.800941 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9274be9-14a0-4079-940c-0b45c6502132-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d9274be9-14a0-4079-940c-0b45c6502132" (UID: "d9274be9-14a0-4079-940c-0b45c6502132"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:44:14.801245 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.801227 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-dshm" (OuterVolumeSpecName: "dshm") pod "d9274be9-14a0-4079-940c-0b45c6502132" (UID: "d9274be9-14a0-4079-940c-0b45c6502132"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:14.847179 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.847156 2562 scope.go:117] "RemoveContainer" containerID="5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60" Apr 22 14:44:14.847527 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:44:14.847502 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60\": container with ID starting with 5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60 not found: ID does not exist" containerID="5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60" Apr 22 14:44:14.847626 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.847537 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60"} err="failed to get container status \"5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60\": rpc error: code = NotFound desc = could 
not find container \"5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60\": container with ID starting with 5ccaafdd070f6e118f3480dec0d1e66d7c92f9f80ea62c1ba59e5188d0951f60 not found: ID does not exist" Apr 22 14:44:14.847626 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.847559 2562 scope.go:117] "RemoveContainer" containerID="e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30" Apr 22 14:44:14.847862 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:44:14.847838 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30\": container with ID starting with e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30 not found: ID does not exist" containerID="e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30" Apr 22 14:44:14.847941 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.847871 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30"} err="failed to get container status \"e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30\": rpc error: code = NotFound desc = could not find container \"e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30\": container with ID starting with e575efd7fa739456b48b777e60872e363fcffa195af78e8ace7a2d0ca463bb30 not found: ID does not exist" Apr 22 14:44:14.847941 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.847897 2562 scope.go:117] "RemoveContainer" containerID="cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8" Apr 22 14:44:14.862567 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.862538 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "d9274be9-14a0-4079-940c-0b45c6502132" (UID: "d9274be9-14a0-4079-940c-0b45c6502132"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:14.867084 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.867062 2562 scope.go:117] "RemoveContainer" containerID="62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a" Apr 22 14:44:14.900943 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.900906 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9bvb\" (UniqueName: \"kubernetes.io/projected/d9274be9-14a0-4079-940c-0b45c6502132-kube-api-access-g9bvb\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.900943 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.900940 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9274be9-14a0-4079-940c-0b45c6502132-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.900943 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.900952 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.901179 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.900960 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-dshm\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.901179 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.900970 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9274be9-14a0-4079-940c-0b45c6502132-home\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:14.934458 ip-10-0-133-31 
kubenswrapper[2562]: I0422 14:44:14.934434 2562 scope.go:117] "RemoveContainer" containerID="2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b" Apr 22 14:44:14.943447 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.943428 2562 scope.go:117] "RemoveContainer" containerID="cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8" Apr 22 14:44:14.943723 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:44:14.943702 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8\": container with ID starting with cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8 not found: ID does not exist" containerID="cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8" Apr 22 14:44:14.943810 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.943731 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8"} err="failed to get container status \"cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8\": rpc error: code = NotFound desc = could not find container \"cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8\": container with ID starting with cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8 not found: ID does not exist" Apr 22 14:44:14.943810 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.943750 2562 scope.go:117] "RemoveContainer" containerID="62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a" Apr 22 14:44:14.943983 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:44:14.943967 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a\": container with ID starting with 
62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a not found: ID does not exist" containerID="62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a" Apr 22 14:44:14.944028 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.943988 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a"} err="failed to get container status \"62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a\": rpc error: code = NotFound desc = could not find container \"62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a\": container with ID starting with 62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a not found: ID does not exist" Apr 22 14:44:14.944028 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.944005 2562 scope.go:117] "RemoveContainer" containerID="2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b" Apr 22 14:44:14.944227 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:44:14.944211 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b\": container with ID starting with 2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b not found: ID does not exist" containerID="2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b" Apr 22 14:44:14.944274 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.944231 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b"} err="failed to get container status \"2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b\": rpc error: code = NotFound desc = could not find container \"2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b\": container with ID starting with 
2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b not found: ID does not exist" Apr 22 14:44:14.944274 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.944244 2562 scope.go:117] "RemoveContainer" containerID="cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8" Apr 22 14:44:14.944425 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.944410 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8"} err="failed to get container status \"cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8\": rpc error: code = NotFound desc = could not find container \"cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8\": container with ID starting with cce18d6d218a63476f9ea7bd89a2f324f157cfd0298af9d6c454958c6e1dd1c8 not found: ID does not exist" Apr 22 14:44:14.944470 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.944425 2562 scope.go:117] "RemoveContainer" containerID="62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a" Apr 22 14:44:14.944626 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.944607 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a"} err="failed to get container status \"62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a\": rpc error: code = NotFound desc = could not find container \"62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a\": container with ID starting with 62150d34674c9155e1c39b0a7ced9ab43b8d3146358380b5eae5db70cd78727a not found: ID does not exist" Apr 22 14:44:14.944687 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.944628 2562 scope.go:117] "RemoveContainer" containerID="2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b" Apr 22 14:44:14.944842 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:14.944827 2562 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b"} err="failed to get container status \"2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b\": rpc error: code = NotFound desc = could not find container \"2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b\": container with ID starting with 2095b477137bb9fffdfaf75a09571f436e90f39752ff2b580bb79c01664aa97b not found: ID does not exist" Apr 22 14:44:15.072469 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:15.072436 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc"] Apr 22 14:44:15.077603 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:15.077578 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-c58f8844b-v8blc"] Apr 22 14:44:15.091490 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:15.091457 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq"] Apr 22 14:44:15.094770 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:15.094741 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b9b967f79-rf2cq"] Apr 22 14:44:16.925008 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:16.924972 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" path="/var/lib/kubelet/pods/30c3a348-63cb-46e0-bbb1-dd9296f1d5df/volumes" Apr 22 14:44:16.925440 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:16.925425 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9274be9-14a0-4079-940c-0b45c6502132" path="/var/lib/kubelet/pods/d9274be9-14a0-4079-940c-0b45c6502132/volumes" Apr 22 14:44:22.112680 ip-10-0-133-31 kubenswrapper[2562]: I0422 
14:44:22.112624 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls"] Apr 22 14:44:22.113197 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:22.112947 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="main" containerID="cri-o://299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3" gracePeriod=30 Apr 22 14:44:22.113197 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:22.113010 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="tokenizer" containerID="cri-o://fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56" gracePeriod=30 Apr 22 14:44:22.666573 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:44:22.666542 2562 logging.go:55] [core] [Channel #60 SubChannel #61]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.56:9003", ServerName: "10.133.0.56:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.56:9003: connect: connection refused" Apr 22 14:44:22.788995 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:22.788955 2562 generic.go:358] "Generic (PLEG): container finished" podID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerID="299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3" exitCode=0 Apr 22 14:44:22.789198 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:22.789030 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" event={"ID":"ab098998-0b47-4cde-8c41-4ec4474e82c8","Type":"ContainerDied","Data":"299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3"} Apr 22 14:44:23.375292 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.375270 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:44:23.473144 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473052 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-cache\") pod \"ab098998-0b47-4cde-8c41-4ec4474e82c8\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " Apr 22 14:44:23.473144 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473112 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-kserve-provision-location\") pod \"ab098998-0b47-4cde-8c41-4ec4474e82c8\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " Apr 22 14:44:23.473144 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473141 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkj75\" (UniqueName: 
\"kubernetes.io/projected/ab098998-0b47-4cde-8c41-4ec4474e82c8-kube-api-access-kkj75\") pod \"ab098998-0b47-4cde-8c41-4ec4474e82c8\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " Apr 22 14:44:23.473436 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473182 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab098998-0b47-4cde-8c41-4ec4474e82c8-tls-certs\") pod \"ab098998-0b47-4cde-8c41-4ec4474e82c8\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " Apr 22 14:44:23.473436 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473204 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-tmp\") pod \"ab098998-0b47-4cde-8c41-4ec4474e82c8\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " Apr 22 14:44:23.473436 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473333 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-uds\") pod \"ab098998-0b47-4cde-8c41-4ec4474e82c8\" (UID: \"ab098998-0b47-4cde-8c41-4ec4474e82c8\") " Apr 22 14:44:23.473436 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473339 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ab098998-0b47-4cde-8c41-4ec4474e82c8" (UID: "ab098998-0b47-4cde-8c41-4ec4474e82c8"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:23.473676 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473544 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ab098998-0b47-4cde-8c41-4ec4474e82c8" (UID: "ab098998-0b47-4cde-8c41-4ec4474e82c8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:23.473676 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473565 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ab098998-0b47-4cde-8c41-4ec4474e82c8" (UID: "ab098998-0b47-4cde-8c41-4ec4474e82c8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:23.473796 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473707 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-tmp\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:23.473796 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473725 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-uds\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:23.473796 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473740 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-tokenizer-cache\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:23.474004 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.473978 2562 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ab098998-0b47-4cde-8c41-4ec4474e82c8" (UID: "ab098998-0b47-4cde-8c41-4ec4474e82c8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:44:23.475273 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.475253 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab098998-0b47-4cde-8c41-4ec4474e82c8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ab098998-0b47-4cde-8c41-4ec4474e82c8" (UID: "ab098998-0b47-4cde-8c41-4ec4474e82c8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:44:23.475346 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.475329 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab098998-0b47-4cde-8c41-4ec4474e82c8-kube-api-access-kkj75" (OuterVolumeSpecName: "kube-api-access-kkj75") pod "ab098998-0b47-4cde-8c41-4ec4474e82c8" (UID: "ab098998-0b47-4cde-8c41-4ec4474e82c8"). InnerVolumeSpecName "kube-api-access-kkj75". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:44:23.575007 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.574967 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab098998-0b47-4cde-8c41-4ec4474e82c8-kserve-provision-location\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:23.575007 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.575000 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kkj75\" (UniqueName: \"kubernetes.io/projected/ab098998-0b47-4cde-8c41-4ec4474e82c8-kube-api-access-kkj75\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:23.575007 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.575011 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab098998-0b47-4cde-8c41-4ec4474e82c8-tls-certs\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\"" Apr 22 14:44:23.666944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.666896 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.56:9003\" within 1s: context deadline exceeded" Apr 22 14:44:23.794876 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.794785 2562 generic.go:358] "Generic (PLEG): container finished" podID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerID="fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56" exitCode=0 Apr 22 14:44:23.794876 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.794867 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" Apr 22 14:44:23.795091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.794866 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" event={"ID":"ab098998-0b47-4cde-8c41-4ec4474e82c8","Type":"ContainerDied","Data":"fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56"} Apr 22 14:44:23.795091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.794982 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls" event={"ID":"ab098998-0b47-4cde-8c41-4ec4474e82c8","Type":"ContainerDied","Data":"f8839401b69d57e32bf71befb39b122d5f018c04dcbb86010ee83a1c219128ee"} Apr 22 14:44:23.795091 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.794998 2562 scope.go:117] "RemoveContainer" containerID="fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56" Apr 22 14:44:23.804250 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.804235 2562 scope.go:117] "RemoveContainer" containerID="299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3" Apr 22 14:44:23.816778 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.816735 2562 scope.go:117] "RemoveContainer" containerID="e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291" Apr 22 14:44:23.819339 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.819317 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls"] Apr 22 14:44:23.822729 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.822705 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-8574822pls"] Apr 22 14:44:23.825765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.825686 2562 
scope.go:117] "RemoveContainer" containerID="fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56" Apr 22 14:44:23.825977 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:44:23.825959 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56\": container with ID starting with fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56 not found: ID does not exist" containerID="fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56" Apr 22 14:44:23.826037 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.825984 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56"} err="failed to get container status \"fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56\": rpc error: code = NotFound desc = could not find container \"fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56\": container with ID starting with fcfc770f8ca782ffb819ae9ca9cb3bdefcc5deaf4372251e384b1fedfbe07f56 not found: ID does not exist" Apr 22 14:44:23.826037 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.826005 2562 scope.go:117] "RemoveContainer" containerID="299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3" Apr 22 14:44:23.826318 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:44:23.826291 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3\": container with ID starting with 299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3 not found: ID does not exist" containerID="299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3" Apr 22 14:44:23.826380 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.826326 2562 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3"} err="failed to get container status \"299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3\": rpc error: code = NotFound desc = could not find container \"299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3\": container with ID starting with 299639d3ee3320f488769f39f3f8eef044893c2220854fcf4ff3efe2be4dc6e3 not found: ID does not exist" Apr 22 14:44:23.826380 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.826343 2562 scope.go:117] "RemoveContainer" containerID="e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291" Apr 22 14:44:23.826596 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:44:23.826579 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291\": container with ID starting with e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291 not found: ID does not exist" containerID="e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291" Apr 22 14:44:23.826676 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:23.826601 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291"} err="failed to get container status \"e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291\": rpc error: code = NotFound desc = could not find container \"e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291\": container with ID starting with e111fd6418bcfc121ab1b49b13f4138c4390fafec8cc9d992a5dc0adbb1c9291 not found: ID does not exist" Apr 22 14:44:24.925469 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:24.925432 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" 
path="/var/lib/kubelet/pods/ab098998-0b47-4cde-8c41-4ec4474e82c8/volumes" Apr 22 14:44:56.973618 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:56.973588 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:44:56.980698 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:44:56.980674 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log" Apr 22 14:46:51.273185 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273149 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wtwck/must-gather-hr92f"] Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273515 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="storage-initializer" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273529 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="storage-initializer" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273539 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273545 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273557 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="main" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273563 2562 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="main" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273568 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="storage-initializer" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273573 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="storage-initializer" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273578 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="tokenizer" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273582 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="tokenizer" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273590 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="storage-initializer" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273595 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="storage-initializer" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273601 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="llm-d-routing-sidecar" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273606 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="llm-d-routing-sidecar" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273613 2562 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" Apr 22 14:46:51.273610 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273618 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" Apr 22 14:46:51.274126 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273681 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="main" Apr 22 14:46:51.274126 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273692 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="main" Apr 22 14:46:51.274126 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273701 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab098998-0b47-4cde-8c41-4ec4474e82c8" containerName="tokenizer" Apr 22 14:46:51.274126 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273707 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="30c3a348-63cb-46e0-bbb1-dd9296f1d5df" containerName="llm-d-routing-sidecar" Apr 22 14:46:51.274126 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.273713 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9274be9-14a0-4079-940c-0b45c6502132" containerName="main" Apr 22 14:46:51.276676 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.276638 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wtwck/must-gather-hr92f" Apr 22 14:46:51.280855 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.280834 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wtwck\"/\"kube-root-ca.crt\"" Apr 22 14:46:51.281023 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.280850 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wtwck\"/\"default-dockercfg-47zqb\"" Apr 22 14:46:51.281112 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.280869 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wtwck\"/\"openshift-service-ca.crt\"" Apr 22 14:46:51.283717 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.283694 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wtwck/must-gather-hr92f"] Apr 22 14:46:51.376853 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.376812 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9d2a944-1254-4bcc-9719-148bf5236e74-must-gather-output\") pod \"must-gather-hr92f\" (UID: \"b9d2a944-1254-4bcc-9719-148bf5236e74\") " pod="openshift-must-gather-wtwck/must-gather-hr92f" Apr 22 14:46:51.377031 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.376864 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqpl\" (UniqueName: \"kubernetes.io/projected/b9d2a944-1254-4bcc-9719-148bf5236e74-kube-api-access-gfqpl\") pod \"must-gather-hr92f\" (UID: \"b9d2a944-1254-4bcc-9719-148bf5236e74\") " pod="openshift-must-gather-wtwck/must-gather-hr92f" Apr 22 14:46:51.477856 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.477816 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/b9d2a944-1254-4bcc-9719-148bf5236e74-must-gather-output\") pod \"must-gather-hr92f\" (UID: \"b9d2a944-1254-4bcc-9719-148bf5236e74\") " pod="openshift-must-gather-wtwck/must-gather-hr92f" Apr 22 14:46:51.477856 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.477863 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqpl\" (UniqueName: \"kubernetes.io/projected/b9d2a944-1254-4bcc-9719-148bf5236e74-kube-api-access-gfqpl\") pod \"must-gather-hr92f\" (UID: \"b9d2a944-1254-4bcc-9719-148bf5236e74\") " pod="openshift-must-gather-wtwck/must-gather-hr92f" Apr 22 14:46:51.478234 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.478208 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9d2a944-1254-4bcc-9719-148bf5236e74-must-gather-output\") pod \"must-gather-hr92f\" (UID: \"b9d2a944-1254-4bcc-9719-148bf5236e74\") " pod="openshift-must-gather-wtwck/must-gather-hr92f" Apr 22 14:46:51.486753 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.486729 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqpl\" (UniqueName: \"kubernetes.io/projected/b9d2a944-1254-4bcc-9719-148bf5236e74-kube-api-access-gfqpl\") pod \"must-gather-hr92f\" (UID: \"b9d2a944-1254-4bcc-9719-148bf5236e74\") " pod="openshift-must-gather-wtwck/must-gather-hr92f" Apr 22 14:46:51.587639 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.587607 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wtwck/must-gather-hr92f" Apr 22 14:46:51.708122 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.708093 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wtwck/must-gather-hr92f"] Apr 22 14:46:51.709947 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:46:51.709915 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9d2a944_1254_4bcc_9719_148bf5236e74.slice/crio-39761762217befe9aeffa55dfb14ebb619757948a361b3bc34be56350b99b746 WatchSource:0}: Error finding container 39761762217befe9aeffa55dfb14ebb619757948a361b3bc34be56350b99b746: Status 404 returned error can't find the container with id 39761762217befe9aeffa55dfb14ebb619757948a361b3bc34be56350b99b746 Apr 22 14:46:51.711620 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:51.711600 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:46:52.346344 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:52.346307 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wtwck/must-gather-hr92f" event={"ID":"b9d2a944-1254-4bcc-9719-148bf5236e74","Type":"ContainerStarted","Data":"39761762217befe9aeffa55dfb14ebb619757948a361b3bc34be56350b99b746"} Apr 22 14:46:57.371179 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:57.371136 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wtwck/must-gather-hr92f" event={"ID":"b9d2a944-1254-4bcc-9719-148bf5236e74","Type":"ContainerStarted","Data":"a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e"} Apr 22 14:46:57.371179 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:57.371183 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wtwck/must-gather-hr92f" 
event={"ID":"b9d2a944-1254-4bcc-9719-148bf5236e74","Type":"ContainerStarted","Data":"9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45"} Apr 22 14:46:57.391875 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:46:57.391765 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wtwck/must-gather-hr92f" podStartSLOduration=1.224775472 podStartE2EDuration="6.391744518s" podCreationTimestamp="2026-04-22 14:46:51 +0000 UTC" firstStartedPulling="2026-04-22 14:46:51.711739797 +0000 UTC m=+1915.411100456" lastFinishedPulling="2026-04-22 14:46:56.878708842 +0000 UTC m=+1920.578069502" observedRunningTime="2026-04-22 14:46:57.388644958 +0000 UTC m=+1921.088005678" watchObservedRunningTime="2026-04-22 14:46:57.391744518 +0000 UTC m=+1921.091105202" Apr 22 14:47:07.085725 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:07.085687 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:07.116855 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:07.116819 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:08.214918 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:08.214886 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:08.233058 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:08.233030 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:09.348442 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:09.348408 2562 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:09.368626 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:09.368595 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:10.438344 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:10.438310 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:10.455151 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:10.455116 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:11.486042 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:11.486009 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:11.503512 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:11.503483 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:12.539126 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:12.539095 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:12.557956 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:12.557926 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:13.587961 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:13.587928 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:13.605581 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:13.605544 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:14.657910 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:14.657881 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:14.683881 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:14.683852 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:15.751992 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:15.751954 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:15.777839 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:15.777805 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:16.812912 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:16.812883 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:16.841067 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:16.841036 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:17.905808 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:17.905779 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:17.923555 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:17.923519 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:19.003333 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:19.003301 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:19.021865 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:19.021839 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:20.033894 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:20.033858 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:20.051268 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:20.051234 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:21.058613 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:21.058582 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-k9fch_3083646c-cb86-490f-b489-adfa24221e89/istio-proxy/0.log" Apr 22 14:47:21.093148 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:21.093118 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-jzrxz_35510640-4170-4d27-813b-866bd7f36d55/istio-proxy/0.log" Apr 22 14:47:22.120618 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:22.120582 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-vs482_89666a12-f26d-461b-a935-b30133ba67c1/istio-proxy/0.log" Apr 22 14:47:22.984942 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:22.984909 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-vs482_89666a12-f26d-461b-a935-b30133ba67c1/istio-proxy/0.log" Apr 22 14:47:23.859880 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:23.859842 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-dcrzg_97ac6cd5-d3a8-48eb-a86e-fcebb2a78549/manager/0.log" Apr 22 14:47:23.929512 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:23.929486 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-fhq85_dd91f939-be95-434d-b3a6-d197f8ea66fb/manager/0.log" Apr 22 14:47:23.979206 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:23.979177 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-85cgp_800b1787-62f8-4297-a35c-9bab9e7e7c78/limitador/0.log"
Apr 22 14:47:25.480957 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:25.480921 2562 generic.go:358] "Generic (PLEG): container finished" podID="b9d2a944-1254-4bcc-9719-148bf5236e74" containerID="9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45" exitCode=0
Apr 22 14:47:25.481380 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:25.480992 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wtwck/must-gather-hr92f" event={"ID":"b9d2a944-1254-4bcc-9719-148bf5236e74","Type":"ContainerDied","Data":"9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45"}
Apr 22 14:47:25.481380 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:25.481302 2562 scope.go:117] "RemoveContainer" containerID="9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45"
Apr 22 14:47:25.715789 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:25.715758 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wtwck_must-gather-hr92f_b9d2a944-1254-4bcc-9719-148bf5236e74/gather/0.log"
Apr 22 14:47:29.631457 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:29.631417 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kcb7w_1b5088f9-42ba-4937-95a9-db3577261f8f/global-pull-secret-syncer/0.log"
Apr 22 14:47:29.770445 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:29.770407 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fwhsq_cc6398d2-2767-495e-b2e8-7f3f713e5a31/konnectivity-agent/0.log"
Apr 22 14:47:29.829515 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:29.829488 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-31.ec2.internal_8afe0898337b196494e7d612c6fa17e1/haproxy/0.log"
Apr 22 14:47:31.218343 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.218309 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wtwck/must-gather-hr92f"]
Apr 22 14:47:31.218824 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.218589 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-wtwck/must-gather-hr92f" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" containerName="copy" containerID="cri-o://a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e" gracePeriod=2
Apr 22 14:47:31.221050 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.221017 2562 status_manager.go:895] "Failed to get status for pod" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" pod="openshift-must-gather-wtwck/must-gather-hr92f" err="pods \"must-gather-hr92f\" is forbidden: User \"system:node:ip-10-0-133-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wtwck\": no relationship found between node 'ip-10-0-133-31.ec2.internal' and this object"
Apr 22 14:47:31.222107 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.222077 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wtwck/must-gather-hr92f"]
Apr 22 14:47:31.452195 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.452172 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wtwck_must-gather-hr92f_b9d2a944-1254-4bcc-9719-148bf5236e74/copy/0.log"
Apr 22 14:47:31.452497 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.452482 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wtwck/must-gather-hr92f"
Apr 22 14:47:31.454897 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.454863 2562 status_manager.go:895] "Failed to get status for pod" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" pod="openshift-must-gather-wtwck/must-gather-hr92f" err="pods \"must-gather-hr92f\" is forbidden: User \"system:node:ip-10-0-133-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wtwck\": no relationship found between node 'ip-10-0-133-31.ec2.internal' and this object"
Apr 22 14:47:31.506149 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.506058 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wtwck_must-gather-hr92f_b9d2a944-1254-4bcc-9719-148bf5236e74/copy/0.log"
Apr 22 14:47:31.506435 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.506410 2562 generic.go:358] "Generic (PLEG): container finished" podID="b9d2a944-1254-4bcc-9719-148bf5236e74" containerID="a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e" exitCode=143
Apr 22 14:47:31.506517 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.506468 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wtwck/must-gather-hr92f"
Apr 22 14:47:31.506578 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.506525 2562 scope.go:117] "RemoveContainer" containerID="a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e"
Apr 22 14:47:31.509265 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.509239 2562 status_manager.go:895] "Failed to get status for pod" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" pod="openshift-must-gather-wtwck/must-gather-hr92f" err="pods \"must-gather-hr92f\" is forbidden: User \"system:node:ip-10-0-133-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wtwck\": no relationship found between node 'ip-10-0-133-31.ec2.internal' and this object"
Apr 22 14:47:31.514510 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.514492 2562 scope.go:117] "RemoveContainer" containerID="9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45"
Apr 22 14:47:31.529718 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.529696 2562 scope.go:117] "RemoveContainer" containerID="a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e"
Apr 22 14:47:31.530027 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:47:31.530009 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e\": container with ID starting with a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e not found: ID does not exist" containerID="a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e"
Apr 22 14:47:31.530076 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.530041 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e"} err="failed to get container status \"a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e\": rpc error: code = NotFound desc = could not find container \"a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e\": container with ID starting with a3375604dfafa6542cbc7e4ead19943f3662cff5c8ff14423337c3d43aa8c62e not found: ID does not exist"
Apr 22 14:47:31.530076 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.530060 2562 scope.go:117] "RemoveContainer" containerID="9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45"
Apr 22 14:47:31.530295 ip-10-0-133-31 kubenswrapper[2562]: E0422 14:47:31.530273 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45\": container with ID starting with 9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45 not found: ID does not exist" containerID="9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45"
Apr 22 14:47:31.530338 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.530303 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45"} err="failed to get container status \"9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45\": rpc error: code = NotFound desc = could not find container \"9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45\": container with ID starting with 9c627c851ffff4c2fd54b527314545dcaf2d94495ae1deef55ee940a92f1da45 not found: ID does not exist"
Apr 22 14:47:31.541589 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.541560 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9d2a944-1254-4bcc-9719-148bf5236e74-must-gather-output\") pod \"b9d2a944-1254-4bcc-9719-148bf5236e74\" (UID: \"b9d2a944-1254-4bcc-9719-148bf5236e74\") "
Apr 22 14:47:31.541686 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.541636 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfqpl\" (UniqueName: \"kubernetes.io/projected/b9d2a944-1254-4bcc-9719-148bf5236e74-kube-api-access-gfqpl\") pod \"b9d2a944-1254-4bcc-9719-148bf5236e74\" (UID: \"b9d2a944-1254-4bcc-9719-148bf5236e74\") "
Apr 22 14:47:31.543768 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.543733 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d2a944-1254-4bcc-9719-148bf5236e74-kube-api-access-gfqpl" (OuterVolumeSpecName: "kube-api-access-gfqpl") pod "b9d2a944-1254-4bcc-9719-148bf5236e74" (UID: "b9d2a944-1254-4bcc-9719-148bf5236e74"). InnerVolumeSpecName "kube-api-access-gfqpl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:47:31.547570 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.547543 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9d2a944-1254-4bcc-9719-148bf5236e74-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b9d2a944-1254-4bcc-9719-148bf5236e74" (UID: "b9d2a944-1254-4bcc-9719-148bf5236e74"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:47:31.643270 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.643238 2562 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9d2a944-1254-4bcc-9719-148bf5236e74-must-gather-output\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:47:31.643270 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.643266 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gfqpl\" (UniqueName: \"kubernetes.io/projected/b9d2a944-1254-4bcc-9719-148bf5236e74-kube-api-access-gfqpl\") on node \"ip-10-0-133-31.ec2.internal\" DevicePath \"\""
Apr 22 14:47:31.819141 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:31.819106 2562 status_manager.go:895] "Failed to get status for pod" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" pod="openshift-must-gather-wtwck/must-gather-hr92f" err="pods \"must-gather-hr92f\" is forbidden: User \"system:node:ip-10-0-133-31.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wtwck\": no relationship found between node 'ip-10-0-133-31.ec2.internal' and this object"
Apr 22 14:47:32.925316 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:32.925280 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" path="/var/lib/kubelet/pods/b9d2a944-1254-4bcc-9719-148bf5236e74/volumes"
Apr 22 14:47:33.964587 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:33.964553 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-dcrzg_97ac6cd5-d3a8-48eb-a86e-fcebb2a78549/manager/0.log"
Apr 22 14:47:34.080593 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:34.080550 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-fhq85_dd91f939-be95-434d-b3a6-d197f8ea66fb/manager/0.log"
Apr 22 14:47:34.107824 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:34.107785 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-85cgp_800b1787-62f8-4297-a35c-9bab9e7e7c78/limitador/0.log"
Apr 22 14:47:35.403289 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:35.403260 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6lbp8_d3fa9b97-6462-4371-bd9c-fbfb49153cf7/node-exporter/0.log"
Apr 22 14:47:35.434627 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:35.434604 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6lbp8_d3fa9b97-6462-4371-bd9c-fbfb49153cf7/kube-rbac-proxy/0.log"
Apr 22 14:47:35.454301 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:35.454273 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6lbp8_d3fa9b97-6462-4371-bd9c-fbfb49153cf7/init-textfile/0.log"
Apr 22 14:47:35.953866 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:35.953837 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74b7dc4dd5-42lxp_0038ac9f-6cc0-4497-8891-cb1708f6d62d/telemeter-client/0.log"
Apr 22 14:47:35.977340 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:35.977313 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74b7dc4dd5-42lxp_0038ac9f-6cc0-4497-8891-cb1708f6d62d/reload/0.log"
Apr 22 14:47:36.010496 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:36.010462 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74b7dc4dd5-42lxp_0038ac9f-6cc0-4497-8891-cb1708f6d62d/kube-rbac-proxy/0.log"
Apr 22 14:47:38.753288 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.753251 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"]
Apr 22 14:47:38.753765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.753594 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" containerName="copy"
Apr 22 14:47:38.753765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.753604 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" containerName="copy"
Apr 22 14:47:38.753765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.753622 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" containerName="gather"
Apr 22 14:47:38.753765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.753627 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" containerName="gather"
Apr 22 14:47:38.753765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.753700 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" containerName="copy"
Apr 22 14:47:38.753765 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.753710 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9d2a944-1254-4bcc-9719-148bf5236e74" containerName="gather"
Apr 22 14:47:38.759931 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.759914 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:38.763500 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.763477 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-54rh7\"/\"default-dockercfg-4zg56\""
Apr 22 14:47:38.763622 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.763511 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-54rh7\"/\"openshift-service-ca.crt\""
Apr 22 14:47:38.763622 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.763570 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-54rh7\"/\"kube-root-ca.crt\""
Apr 22 14:47:38.767208 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.767178 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"]
Apr 22 14:47:38.911901 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.911861 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdnrv\" (UniqueName: \"kubernetes.io/projected/926455ad-8861-49ad-8455-4e429dace250-kube-api-access-mdnrv\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:38.912073 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.911946 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-podres\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:38.912073 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.911993 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-proc\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:38.912073 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.912019 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-sys\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:38.912073 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:38.912041 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-lib-modules\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.012564 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.012462 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-sys\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.012564 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.012507 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-lib-modules\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.012564 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.012556 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdnrv\" (UniqueName: \"kubernetes.io/projected/926455ad-8861-49ad-8455-4e429dace250-kube-api-access-mdnrv\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.012892 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.012599 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-sys\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.012892 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.012632 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-podres\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.012892 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.012717 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-lib-modules\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.012892 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.012800 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-proc\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.013078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.012965 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-proc\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.013078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.013048 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/926455ad-8861-49ad-8455-4e429dace250-podres\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.021553 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.021523 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdnrv\" (UniqueName: \"kubernetes.io/projected/926455ad-8861-49ad-8455-4e429dace250-kube-api-access-mdnrv\") pod \"perf-node-gather-daemonset-qv5v9\" (UID: \"926455ad-8861-49ad-8455-4e429dace250\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.077672 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.077607 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.203764 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.203738 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"]
Apr 22 14:47:39.205280 ip-10-0-133-31 kubenswrapper[2562]: W0422 14:47:39.205253 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod926455ad_8861_49ad_8455_4e429dace250.slice/crio-7cfa7205c8bdf8c215704835a9ffce4325bb29f0b58c19b84fdf2edb7f33589a WatchSource:0}: Error finding container 7cfa7205c8bdf8c215704835a9ffce4325bb29f0b58c19b84fdf2edb7f33589a: Status 404 returned error can't find the container with id 7cfa7205c8bdf8c215704835a9ffce4325bb29f0b58c19b84fdf2edb7f33589a
Apr 22 14:47:39.541829 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.541793 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9" event={"ID":"926455ad-8861-49ad-8455-4e429dace250","Type":"ContainerStarted","Data":"049c6b5f5119e9054baee538fb09f572e30b0adf22771dac418f83341db85fa7"}
Apr 22 14:47:39.541829 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.541831 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9" event={"ID":"926455ad-8861-49ad-8455-4e429dace250","Type":"ContainerStarted","Data":"7cfa7205c8bdf8c215704835a9ffce4325bb29f0b58c19b84fdf2edb7f33589a"}
Apr 22 14:47:39.542080 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.541855 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:39.561235 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.561134 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9" podStartSLOduration=1.561118006 podStartE2EDuration="1.561118006s" podCreationTimestamp="2026-04-22 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:47:39.558676471 +0000 UTC m=+1963.258037146" watchObservedRunningTime="2026-04-22 14:47:39.561118006 +0000 UTC m=+1963.260478689"
Apr 22 14:47:39.621934 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.621907 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7j877_3f2964c4-19d3-4dcc-b821-38a683bc38f7/dns/0.log"
Apr 22 14:47:39.640925 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.640880 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7j877_3f2964c4-19d3-4dcc-b821-38a683bc38f7/kube-rbac-proxy/0.log"
Apr 22 14:47:39.730934 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:39.730905 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jf64f_a50e9092-d980-437f-925d-016de60cc559/dns-node-resolver/0.log"
Apr 22 14:47:40.270817 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:40.270788 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5bdc9fd6c5-p8cbs_5efc25d9-b6cb-482c-a5aa-bc50fea03e4f/registry/0.log"
Apr 22 14:47:40.295078 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:40.295052 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9t7jt_1ec369a9-6fa7-4522-ab1c-257f1ae32b8d/node-ca/0.log"
Apr 22 14:47:41.232736 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:41.232694 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-vs482_89666a12-f26d-461b-a935-b30133ba67c1/istio-proxy/0.log"
Apr 22 14:47:41.698826 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:41.698791 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-frhw5_59e79879-c532-4a00-a584-9f807448ef98/serve-healthcheck-canary/0.log"
Apr 22 14:47:42.281996 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:42.281964 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j76jp_65caa8fa-a4c2-4744-bb55-9df4683af02f/kube-rbac-proxy/0.log"
Apr 22 14:47:42.301053 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:42.301018 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j76jp_65caa8fa-a4c2-4744-bb55-9df4683af02f/exporter/0.log"
Apr 22 14:47:42.320450 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:42.320424 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j76jp_65caa8fa-a4c2-4744-bb55-9df4683af02f/extractor/0.log"
Apr 22 14:47:45.363890 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:45.363809 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-66cf78b85b-9x4sc_1129478a-b9a3-4f2e-93d8-ec5a234a2054/manager/0.log"
Apr 22 14:47:45.449502 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:45.449470 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-kz7zg_404f0edd-0396-444b-9c49-e2eef58db60b/server/0.log"
Apr 22 14:47:45.557944 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:45.557905 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-qv5v9"
Apr 22 14:47:45.713820 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:45.713721 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-wm5rq_060ac850-d845-4d76-ba56-b12612e9def6/manager/0.log"
Apr 22 14:47:45.737202 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:45.737169 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-x8dpq_f69d9140-fa39-4ab0-a5e7-0b6fb00a4787/s3-init/0.log"
Apr 22 14:47:45.764509 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:45.764482 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-clk5r_c3349d53-fc9e-49c8-899f-ce00642fb46b/seaweedfs/0.log"
Apr 22 14:47:52.066684 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.066594 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8tc5r_25d005dc-d20d-43ae-bb7b-1c3a14bd5ddd/kube-multus/0.log"
Apr 22 14:47:52.227260 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.227229 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sl5cl_0212ebfc-c697-40e1-8939-863a200bf32a/kube-multus-additional-cni-plugins/0.log"
Apr 22 14:47:52.246167 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.246139 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sl5cl_0212ebfc-c697-40e1-8939-863a200bf32a/egress-router-binary-copy/0.log"
Apr 22 14:47:52.265746 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.265723 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sl5cl_0212ebfc-c697-40e1-8939-863a200bf32a/cni-plugins/0.log"
Apr 22 14:47:52.303898 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.303873 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sl5cl_0212ebfc-c697-40e1-8939-863a200bf32a/bond-cni-plugin/0.log"
Apr 22 14:47:52.354449 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.354424 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sl5cl_0212ebfc-c697-40e1-8939-863a200bf32a/routeoverride-cni/0.log"
Apr 22 14:47:52.423070 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.423033 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sl5cl_0212ebfc-c697-40e1-8939-863a200bf32a/whereabouts-cni-bincopy/0.log"
Apr 22 14:47:52.453015 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.452987 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sl5cl_0212ebfc-c697-40e1-8939-863a200bf32a/whereabouts-cni/0.log"
Apr 22 14:47:52.708208 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.708121 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7pz2p_db11d8cb-718e-49f4-a019-bc36f8a9af79/network-metrics-daemon/0.log"
Apr 22 14:47:52.727317 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:52.727266 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7pz2p_db11d8cb-718e-49f4-a019-bc36f8a9af79/kube-rbac-proxy/0.log"
Apr 22 14:47:54.415257 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:54.415225 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-controller/0.log"
Apr 22 14:47:54.436437 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:54.436402 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/0.log"
Apr 22 14:47:54.454664 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:54.454632 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovn-acl-logging/1.log"
Apr 22 14:47:54.481670 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:54.481621 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/kube-rbac-proxy-node/0.log"
Apr 22 14:47:54.504519 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:54.504492 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 14:47:54.524053 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:54.524028 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/northd/0.log"
Apr 22 14:47:54.546342 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:54.546315 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/nbdb/0.log"
Apr 22 14:47:54.566665 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:54.566601 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/sbdb/0.log"
Apr 22 14:47:54.744671 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:54.744575 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k777w_524b05a6-377c-460c-a38e-359a1d04f304/ovnkube-controller/0.log"
Apr 22 14:47:55.822771 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:55.822737 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-8kfb7_58472e4a-b808-4034-b26a-9ab40a3074ec/network-check-target-container/0.log"
Apr 22 14:47:56.885768 ip-10-0-133-31 kubenswrapper[2562]: I0422 14:47:56.885738 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-z56xg_ad3bd840-a967-40e3-9669-959790f9dfb8/iptables-alerter/0.log"