Apr 17 07:49:18.313566 ip-10-0-137-8 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:49:18.313580 ip-10-0-137-8 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:49:18.313590 ip-10-0-137-8 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:49:18.313790 ip-10-0-137-8 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 07:49:28.482018 ip-10-0-137-8 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 07:49:28.482034 ip-10-0-137-8 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ff3cab9e972440c0b17194f7a17e49ff --
Apr 17 07:51:56.139102 ip-10-0-137-8 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:51:56.557618 ip-10-0-137-8 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:56.557618 ip-10-0-137-8 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:51:56.557618 ip-10-0-137-8 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:56.557618 ip-10-0-137-8 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:51:56.557618 ip-10-0-137-8 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:56.560778 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.560688 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:51:56.565411 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565393 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:56.565411 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565411 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565415 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565418 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565422 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565430 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565435 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565438 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565441 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565444 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565447 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565449 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565452 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565455 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565457 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565460 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565462 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565465 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565467 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565470 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:56.565482 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565472 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565475 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565478 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565481 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565484 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565486 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565489 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565492 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565494 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565497 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565500 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565503 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565505 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565508 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565511 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565514 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565516 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565519 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565522 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565524 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:56.565952 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565526 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565529 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565532 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565534 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565537 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565540 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565543 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565546 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565548 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565551 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565553 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565555 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565558 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565560 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565564 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565567 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565569 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565572 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565575 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565577 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:56.566425 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565580 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565583 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565585 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565588 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565590 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565593 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565596 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565598 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565601 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565603 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565606 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565609 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565611 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565614 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565616 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565619 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565621 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565624 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565626 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565629 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:56.566942 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565631 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565636 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565640 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565643 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565646 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.565649 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566067 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566072 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566075 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566079 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566081 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566084 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566087 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566090 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566092 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566095 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566097 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566100 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566102 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:56.567433 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566105 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566108 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566110 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566113 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566116 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566119 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566121 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566124 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566126 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566129 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566131 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566134 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566136 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566139 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566141 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566144 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566146 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566149 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566151 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:56.567911 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566155 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566158 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566161 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566163 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566166 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566168 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566171 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566173 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566175 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566178 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566181 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566183 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566186 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566188 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566191 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566194 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566197 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566200 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566203 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566205 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:56.568385 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566207 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566210 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566212 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566215 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566219 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566222 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566225 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566228 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566231 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566234 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566237 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566239 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566242 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566245 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566247 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566250 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566252 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566255 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566257 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:56.568892 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566259 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566262 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566264 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566267 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566269 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566272 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566274 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566277 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566281 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566284 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566286 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566289 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566292 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566295 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.566298 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567608 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567617 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567623 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567628 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567633 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567637 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:51:56.569356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567642 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567647 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567650 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567654 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567658 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567661 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567664 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567667 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567671 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567674 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567677 2573 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567679 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567682 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567687 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567690 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567693 2573 flags.go:64] FLAG: --config-dir=""
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567696 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567699 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567703 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567706 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567710 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567714 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567717 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567720 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 07:51:56.569872 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567724 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567727 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567730 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567735 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567738 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567741 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567745 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567748 2573 flags.go:64] FLAG: --enable-server="true"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567751 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567755 2573 flags.go:64] FLAG: --event-burst="100"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567758 2573 flags.go:64] FLAG: --event-qps="50"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567761 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567766 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567769 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567773 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567776 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567779 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567799 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567802 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567805 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 07:51:56.570446 ip-10-0-137-8 
kubenswrapper[2573]: I0417 07:51:56.567808 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567811 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567814 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567818 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567821 2573 flags.go:64] FLAG: --feature-gates="" Apr 17 07:51:56.570446 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567824 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567828 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567831 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567835 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567838 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567841 2573 flags.go:64] FLAG: --help="false" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567844 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567847 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567851 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567854 2573 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567857 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567861 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567864 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567866 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567869 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567872 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567875 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567878 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567881 2573 flags.go:64] FLAG: --kube-reserved="" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567884 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567887 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567890 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567893 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:51:56.567896 2573 flags.go:64] FLAG: --lock-file="" Apr 17 07:51:56.571081 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567898 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567901 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567904 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567909 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567912 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567915 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567918 2573 flags.go:64] FLAG: --logging-format="text" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567921 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567925 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567927 2573 flags.go:64] FLAG: --manifest-url="" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567930 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567935 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567938 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567943 2573 flags.go:64] FLAG: --max-pods="110" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:51:56.567946 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567949 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567952 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567956 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567959 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567962 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567965 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567979 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567982 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567985 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 07:51:56.571695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567988 2573 flags.go:64] FLAG: --pod-cidr="" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567991 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.567998 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568001 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 07:51:56.572291 
ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568004 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568007 2573 flags.go:64] FLAG: --port="10250" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568010 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568013 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-048efffde70fab106" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568016 2573 flags.go:64] FLAG: --qos-reserved="" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568019 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568022 2573 flags.go:64] FLAG: --register-node="true" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568025 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568028 2573 flags.go:64] FLAG: --register-with-taints="" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568032 2573 flags.go:64] FLAG: --registry-burst="10" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568034 2573 flags.go:64] FLAG: --registry-qps="5" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568037 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568040 2573 flags.go:64] FLAG: --reserved-memory="" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568044 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568047 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:51:56.568050 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568053 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568056 2573 flags.go:64] FLAG: --runonce="false" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568059 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568063 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568066 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 17 07:51:56.572291 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568069 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568073 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568076 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568080 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568083 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568086 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568089 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568092 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568095 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" 
Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568098 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568100 2573 flags.go:64] FLAG: --system-cgroups="" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568103 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568109 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568112 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568115 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568119 2573 flags.go:64] FLAG: --tls-min-version="" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568121 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568124 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568127 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568130 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568133 2573 flags.go:64] FLAG: --v="2" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568138 2573 flags.go:64] FLAG: --version="false" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568142 2573 flags.go:64] FLAG: --vmodule="" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568146 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" 
Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.568150 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 07:51:56.572957 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568284 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568288 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568291 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568295 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568298 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568301 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568304 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568306 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568310 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568313 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568315 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568318 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 
07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568320 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568323 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568325 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568328 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568330 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568333 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568335 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568338 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:51:56.573539 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568340 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568343 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568345 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568348 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568351 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 
17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568353 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568356 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568359 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568362 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568364 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568366 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568369 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568372 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568379 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568383 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568387 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568391 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568395 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568398 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568401 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:51:56.574092 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568404 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568406 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568410 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568413 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568415 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568418 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568421 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568423 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568427 2573 feature_gate.go:351] 
Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568431 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568433 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568436 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568439 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568442 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568445 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568447 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568450 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568453 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568456 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:51:56.574672 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568458 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568461 2573 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568463 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568466 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568468 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568471 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568474 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568478 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568481 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568483 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568486 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568488 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568491 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568493 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568496 2573 feature_gate.go:328] unrecognized 
feature gate: CPMSMachineNamePrefix Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568499 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568502 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568504 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568507 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568509 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:51:56.575535 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568512 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:51:56.576345 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568514 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:51:56.576345 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568517 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:51:56.576345 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568520 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:51:56.576345 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568522 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:51:56.576345 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568524 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:51:56.576345 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.568527 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:51:56.576345 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.569366 
2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 07:51:56.577781 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.577758 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 07:51:56.577839 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.577798 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 07:51:56.577870 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577860 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:56.577870 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577867 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577871 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577876 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577879 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577882 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577885 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577888 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577891 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577894 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577896 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577899 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577902 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577905 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577907 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577910 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577912 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577915 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577917 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577920 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577922 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:56.577930 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577925 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577928 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577930 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577934 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577937 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577940 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577943 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577946 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577948 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577951 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577954 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577956 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577958 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577961 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577964 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577966 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577969 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577971 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577974 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577976 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:56.578427 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577979 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577982 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577984 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577986 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577991 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577995 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.577998 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578001 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578004 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578006 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578009 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578012 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578014 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578017 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578021 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578024 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578026 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578029 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578032 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:56.578936 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578035 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578037 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578040 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578042 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578045 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578047 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578050 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578053 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578055 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578058 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578060 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578063 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578065 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578068 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578070 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578073 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578076 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578079 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578081 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578084 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:56.579415 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578086 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578089 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578091 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578094 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578097 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578099 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.578105 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578212 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578218 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578221 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578224 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578227 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578230 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578233 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578236 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:56.579949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578239 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578242 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578245 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578247 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578250 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578252 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578255 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578257 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578260 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578263 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578265 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578267 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578270 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578273 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578276 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578278 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578281 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578283 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578286 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578289 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:56.580319 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578292 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578294 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578296 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578299 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578302 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578305 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578307 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578311 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578313 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578315 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578318 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578321 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578323 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578326 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578328 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578331 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578333 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578336 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578338 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578341 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:56.580842 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578343 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578346 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578348 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578351 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578353 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578356 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578358 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578361 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578365 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578368 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578371 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578374 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578376 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578379 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578382 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578385 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578388 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578391 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578394 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578396 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:56.581326 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578399 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578403 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578406 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578409 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578412 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578414 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578417 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578419 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578422 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578424 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578427 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578429 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578432 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578434 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578437 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578440 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578442 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:56.581913 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:56.578445 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:56.582353 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.578450 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:56.582353 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.579178 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 07:51:56.582353 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.582242 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 07:51:56.583034 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.583020 2573 server.go:1019] "Starting client certificate rotation"
Apr 17 07:51:56.583138 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.583118 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:56.583178 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.583161 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:56.606151 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.606128 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:56.609324 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.609303 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:56.625197 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.625172 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 17 07:51:56.630773 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.630751 2573 log.go:25] "Validated CRI v1 image API"
Apr 17 07:51:56.632020 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.632003 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 07:51:56.634246 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.634228 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:51:56.637264 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.637245 2573 fs.go:135] Filesystem UUIDs: map[5166067c-8562-4d58-85a4-38f8aa4a3f14:/dev/nvme0n1p4 78c13eb4-8a99-46b6-894d-f5c450228fc5:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 17 07:51:56.637339 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.637264 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 07:51:56.643799 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.643670 2573 manager.go:217] Machine: {Timestamp:2026-04-17 07:51:56.641964751 +0000 UTC m=+0.388021536 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092963 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a61a8b4fd0952187d7c19968acc7d SystemUUID:ec2a61a8-b4fd-0952-187d-7c19968acc7d BootID:ff3cab9e-9724-40c0-b171-94f7a17e49ff Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:07:b0:09:2e:3f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:07:b0:09:2e:3f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:41:4d:6e:9e:88 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 07:51:56.643799 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.643775 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 07:51:56.643930 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.643912 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 07:51:56.644954 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.644932 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 07:51:56.645094 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.644958 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-8.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"contain
er","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 07:51:56.645139 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.645104 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 07:51:56.645139 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.645112 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 07:51:56.645139 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.645125 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:56.645857 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.645846 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:56.647368 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.647358 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:56.647488 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.647479 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 07:51:56.650025 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.650015 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 17 07:51:56.650062 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.650028 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 07:51:56.650062 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.650040 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 07:51:56.650062 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.650050 2573 kubelet.go:397] "Adding apiserver pod source" Apr 17 07:51:56.650062 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.650059 2573 apiserver.go:42] "Waiting for node sync before watching apiserver 
pods" Apr 17 07:51:56.651107 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.651095 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:56.651153 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.651114 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:56.653870 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.653839 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 07:51:56.655361 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.655347 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 07:51:56.656979 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.656966 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 07:51:56.657029 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.656983 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 07:51:56.657029 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.656990 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 07:51:56.657029 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.656996 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 07:51:56.657029 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.657002 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 07:51:56.657029 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.657009 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 07:51:56.657029 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.657015 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 07:51:56.657029 ip-10-0-137-8 
kubenswrapper[2573]: I0417 07:51:56.657021 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 07:51:56.657215 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.657037 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 07:51:56.657215 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.657044 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 07:51:56.657215 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.657059 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 07:51:56.657215 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.657068 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 07:51:56.657867 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.657858 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 07:51:56.657907 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.657867 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 07:51:56.659343 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.659322 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kb6pf" Apr 17 07:51:56.661142 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.661122 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-8.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 07:51:56.661214 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.661132 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 07:51:56.661214 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.661167 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-8.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 07:51:56.661564 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.661553 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 07:51:56.661602 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.661587 2573 server.go:1295] "Started kubelet" Apr 17 07:51:56.661678 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.661655 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 07:51:56.661775 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.661737 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 07:51:56.661849 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.661811 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 07:51:56.662480 ip-10-0-137-8 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 07:51:56.664104 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.664091 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 17 07:51:56.665736 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.665716 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 07:51:56.667176 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.667157 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kb6pf" Apr 17 07:51:56.669692 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.668825 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-8.ec2.internal.18a71597153dc659 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-8.ec2.internal,UID:ip-10-0-137-8.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-8.ec2.internal,},FirstTimestamp:2026-04-17 07:51:56.661565017 +0000 UTC m=+0.407621802,LastTimestamp:2026-04-17 07:51:56.661565017 +0000 UTC m=+0.407621802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-8.ec2.internal,}" Apr 17 07:51:56.670874 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.670853 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 07:51:56.670958 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.670930 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 07:51:56.671449 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.671433 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 07:51:56.672231 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672123 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 07:51:56.672231 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672140 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 07:51:56.672356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672257 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 07:51:56.672356 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.672311 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:56.672356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672341 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 17 07:51:56.672356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672351 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 17 07:51:56.672637 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672441 2573 factory.go:153] Registering CRI-O factory Apr 17 07:51:56.672637 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672500 2573 factory.go:223] Registration of the crio container factory successfully Apr 17 07:51:56.672637 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672560 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or 
directory Apr 17 07:51:56.672637 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672568 2573 factory.go:55] Registering systemd factory Apr 17 07:51:56.672637 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672577 2573 factory.go:223] Registration of the systemd container factory successfully Apr 17 07:51:56.672637 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672637 2573 factory.go:103] Registering Raw factory Apr 17 07:51:56.672946 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.672650 2573 manager.go:1196] Started watching for new ooms in manager Apr 17 07:51:56.673182 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.673079 2573 manager.go:319] Starting recovery of all containers Apr 17 07:51:56.676125 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.676103 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:56.679978 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.679954 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-8.ec2.internal\" not found" node="ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.684686 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.684670 2573 manager.go:324] Recovery completed Apr 17 07:51:56.688724 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.688710 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:56.691490 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.691474 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:56.691562 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.691500 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:56.691562 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.691510 2573 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:56.692000 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.691988 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 07:51:56.692000 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.691998 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 07:51:56.692098 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.692016 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:56.694165 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.694152 2573 policy_none.go:49] "None policy: Start" Apr 17 07:51:56.694220 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.694168 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 07:51:56.694220 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.694179 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 17 07:51:56.741415 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.741383 2573 manager.go:341] "Starting Device Plugin manager" Apr 17 07:51:56.746495 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.741446 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 07:51:56.746495 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.741457 2573 server.go:85] "Starting device plugin registration server" Apr 17 07:51:56.746495 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.741694 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 07:51:56.746495 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.741706 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 07:51:56.746495 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.741805 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 07:51:56.746495 ip-10-0-137-8 kubenswrapper[2573]: 
I0417 07:51:56.741883 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 07:51:56.746495 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.741891 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 07:51:56.746495 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.743965 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 07:51:56.746495 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.744155 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:56.796025 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.795984 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 07:51:56.797387 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.797369 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 07:51:56.797475 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.797396 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 07:51:56.797475 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.797415 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 07:51:56.797475 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.797421 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 07:51:56.797622 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.797498 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 07:51:56.801429 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.801413 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:56.842280 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.842224 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:56.843114 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.843099 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:56.843174 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.843128 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:56.843174 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.843146 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:56.843244 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.843180 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.849341 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.849328 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.849397 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.849349 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-8.ec2.internal\": node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:56.868690 ip-10-0-137-8 
kubenswrapper[2573]: E0417 07:51:56.868667 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:56.898091 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.898073 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal"] Apr 17 07:51:56.898144 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.898128 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:56.898888 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.898872 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:56.898936 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.898901 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:56.898936 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.898910 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:56.900056 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.900044 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:56.900175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.900163 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.900206 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.900189 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:56.900711 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.900696 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:56.900770 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.900725 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:56.900770 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.900735 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:56.900858 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.900699 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:56.900858 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.900819 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:56.900858 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.900832 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:56.901825 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.901810 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.901938 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.901835 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:56.902404 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.902389 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:56.902494 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.902416 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:56.902494 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.902428 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:56.925336 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.925313 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-8.ec2.internal\" not found" node="ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.928620 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.928605 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-8.ec2.internal\" not found" node="ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.968754 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:56.968737 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:56.974096 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.974083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f695ff27a724f87f07e7f9438b811560-config\") pod 
\"kube-apiserver-proxy-ip-10-0-137-8.ec2.internal\" (UID: \"f695ff27a724f87f07e7f9438b811560\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.974165 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.974104 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/427071f03a4d74acae0367dcd87643a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal\" (UID: \"427071f03a4d74acae0367dcd87643a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" Apr 17 07:51:56.974165 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:56.974126 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/427071f03a4d74acae0367dcd87643a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal\" (UID: \"427071f03a4d74acae0367dcd87643a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.069473 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.069449 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:57.074777 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.074758 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/427071f03a4d74acae0367dcd87643a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal\" (UID: \"427071f03a4d74acae0367dcd87643a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.074844 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.074799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/f695ff27a724f87f07e7f9438b811560-config\") pod \"kube-apiserver-proxy-ip-10-0-137-8.ec2.internal\" (UID: \"f695ff27a724f87f07e7f9438b811560\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.074844 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.074824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/427071f03a4d74acae0367dcd87643a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal\" (UID: \"427071f03a4d74acae0367dcd87643a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.074907 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.074851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/427071f03a4d74acae0367dcd87643a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal\" (UID: \"427071f03a4d74acae0367dcd87643a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.074907 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.074864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/427071f03a4d74acae0367dcd87643a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal\" (UID: \"427071f03a4d74acae0367dcd87643a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.074907 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.074876 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f695ff27a724f87f07e7f9438b811560-config\") pod \"kube-apiserver-proxy-ip-10-0-137-8.ec2.internal\" (UID: \"f695ff27a724f87f07e7f9438b811560\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal" Apr 17 
07:51:57.170394 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.170324 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:57.226823 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.226784 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.232448 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.232431 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.271092 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.271068 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:57.371581 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.371542 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:57.472038 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.471957 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:57.572781 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.572379 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-8.ec2.internal\" not found" Apr 17 07:51:57.582759 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.582731 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 07:51:57.582952 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.582930 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected 
watch close - watch lasted less than a second and no items received" Apr 17 07:51:57.582993 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.582945 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 07:51:57.642302 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.642272 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:57.651169 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.651147 2573 apiserver.go:52] "Watching apiserver" Apr 17 07:51:57.658898 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:57.658860 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf695ff27a724f87f07e7f9438b811560.slice/crio-5a24eb16cc61a7376df9e4000ac5667485f9e1b318751aeaaf73db93eb1d79ca WatchSource:0}: Error finding container 5a24eb16cc61a7376df9e4000ac5667485f9e1b318751aeaaf73db93eb1d79ca: Status 404 returned error can't find the container with id 5a24eb16cc61a7376df9e4000ac5667485f9e1b318751aeaaf73db93eb1d79ca Apr 17 07:51:57.659156 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:57.659121 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod427071f03a4d74acae0367dcd87643a5.slice/crio-617461cef15da73f2a9b15efae68bf9c1ac15036df7ae92d5f9eddac734ce47a WatchSource:0}: Error finding container 617461cef15da73f2a9b15efae68bf9c1ac15036df7ae92d5f9eddac734ce47a: Status 404 returned error can't find the container with id 617461cef15da73f2a9b15efae68bf9c1ac15036df7ae92d5f9eddac734ce47a Apr 17 07:51:57.664204 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.664184 2573 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 17 07:51:57.665766 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.665750 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 07:51:57.667683 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.667664 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr","openshift-dns/node-resolver-r5k8g","openshift-multus/multus-additional-cni-plugins-crv6m","openshift-multus/network-metrics-daemon-ht68l","openshift-ovn-kubernetes/ovnkube-node-f2vdv","openshift-cluster-node-tuning-operator/tuned-tqspd","openshift-image-registry/node-ca-kspjc","openshift-multus/multus-hvczg","openshift-network-diagnostics/network-check-target-scd9x","openshift-network-operator/iptables-alerter-dq722","kube-system/konnectivity-agent-dqw8q"] Apr 17 07:51:57.669856 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.669832 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:46:56 +0000 UTC" deadline="2027-09-20 20:37:00.978751925 +0000 UTC" Apr 17 07:51:57.669933 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.669855 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12516h45m3.308900223s" Apr 17 07:51:57.670968 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.670954 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.671046 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.671032 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 07:51:57.671700 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.671683 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.671883 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.671865 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r5k8g" Apr 17 07:51:57.671967 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.671951 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.673067 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.673050 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:51:57.673158 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.673107 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:51:57.673398 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.673379 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 07:51:57.673398 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.673393 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 07:51:57.673665 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.673631 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 07:51:57.673738 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.673649 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qxnxl\"" Apr 17 07:51:57.674299 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.674281 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 07:51:57.674387 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.674351 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 07:51:57.674445 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.674386 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hsx9w\"" Apr 17 07:51:57.674525 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.674508 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.674573 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.674562 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 07:51:57.674719 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.674699 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pqf5d\"" Apr 17 07:51:57.674966 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.674811 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 07:51:57.674966 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.674899 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 07:51:57.675105 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.675079 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 07:51:57.675105 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.675096 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 07:51:57.675690 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.675676 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.677247 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.676863 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 07:51:57.677247 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.676976 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 07:51:57.677247 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.677025 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kspjc" Apr 17 07:51:57.677247 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.677055 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 07:51:57.677526 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.677499 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 07:51:57.677581 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.677540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 07:51:57.677632 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.677506 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-c9jv7\"" Apr 17 07:51:57.677906 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.677884 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4c7d0c52-01d6-4b13-b631-cd9e35e59fa6-hosts-file\") pod \"node-resolver-r5k8g\" (UID: \"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6\") " pod="openshift-dns/node-resolver-r5k8g" Apr 17 
07:51:57.677992 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.677920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-system-cni-dir\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.677992 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.677972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2f97\" (UniqueName: \"kubernetes.io/projected/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-kube-api-access-r2f97\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.678103 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.677989 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sdr4g\"" Apr 17 07:51:57.678103 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:51:57.678103 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-device-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 
07:51:57.678103 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678059 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 07:51:57.678103 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678062 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-etc-selinux\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.678103 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678010 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5pxk\" (UniqueName: \"kubernetes.io/projected/7641578b-50bd-469d-ab99-7bcdcbb1d6db-kube-api-access-w5pxk\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c7d0c52-01d6-4b13-b631-cd9e35e59fa6-tmp-dir\") pod \"node-resolver-r5k8g\" (UID: \"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6\") " pod="openshift-dns/node-resolver-r5k8g" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-cnibin\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-os-release\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678195 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2q7v\" (UniqueName: \"kubernetes.io/projected/341e9133-613e-45d4-bb0a-a187c93be340-kube-api-access-p2q7v\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-socket-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678268 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678291 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678315 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678364 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-sys-fs\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.678392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6mk6\" (UniqueName: \"kubernetes.io/projected/4c7d0c52-01d6-4b13-b631-cd9e35e59fa6-kube-api-access-l6mk6\") pod \"node-resolver-r5k8g\" (UID: \"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6\") " pod="openshift-dns/node-resolver-r5k8g" Apr 17 07:51:57.679155 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-registration-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.679155 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678472 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.679155 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.678812 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 07:51:57.679760 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.679632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tmwwc\"" Apr 17 07:51:57.679760 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.679657 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 07:51:57.679760 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.679669 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 07:51:57.679760 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.679689 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 07:51:57.679989 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.679877 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-dq722" Apr 17 07:51:57.682575 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.681138 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cqlwn\"" Apr 17 07:51:57.682575 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.681333 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 07:51:57.682575 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.682027 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xmqmw\"" Apr 17 07:51:57.682772 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.682735 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 07:51:57.682894 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.682864 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dqw8q" Apr 17 07:51:57.683119 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.683096 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:51:57.683296 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.683170 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 07:51:57.684558 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.684539 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:51:57.684639 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.684612 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:51:57.684961 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.684946 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mxk9b\"" Apr 17 07:51:57.685034 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.685012 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 07:51:57.685220 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.685204 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 07:51:57.685596 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.685576 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal"] Apr 17 07:51:57.686197 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.686180 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 07:51:57.686356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.686264 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal" Apr 17 07:51:57.687369 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.687349 2573 reflector.go:430] "Caches populated" 
logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 07:51:57.694180 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.694167 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 07:51:57.694351 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.694335 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal"] Apr 17 07:51:57.701678 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.701661 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-v6q2r" Apr 17 07:51:57.708933 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.708914 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-v6q2r" Apr 17 07:51:57.749828 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.749723 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:57.773022 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.772997 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 07:51:57.778995 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.778968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-system-cni-dir\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.779143 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.778999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5pxk\" 
(UniqueName: \"kubernetes.io/projected/7641578b-50bd-469d-ab99-7bcdcbb1d6db-kube-api-access-w5pxk\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.779143 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-slash\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.779143 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-run-openvswitch\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.779143 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-system-cni-dir\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.779143 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-modprobe-d\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.779143 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:51:57.779097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5c58d588-382f-46d8-be38-9af05f699f8f-cni-binary-copy\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-multus-socket-dir-parent\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c7d0c52-01d6-4b13-b631-cd9e35e59fa6-tmp-dir\") pod \"node-resolver-r5k8g\" (UID: \"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6\") " pod="openshift-dns/node-resolver-r5k8g" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-socket-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-kubernetes\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779298 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-run-k8s-cni-cncf-io\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-run-netns\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779402 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-sysctl-conf\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.779431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779425 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-multus-cni-dir\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779447 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-var-lib-kubelet\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779472 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmklz\" (UniqueName: \"kubernetes.io/projected/5c58d588-382f-46d8-be38-9af05f699f8f-kube-api-access-cmklz\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-registration-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779500 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-socket-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c7d0c52-01d6-4b13-b631-cd9e35e59fa6-tmp-dir\") pod \"node-resolver-r5k8g\" (UID: \"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6\") " pod="openshift-dns/node-resolver-r5k8g" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779563 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-registration-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:51:57.779621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf2999c2-b9c3-4067-b076-2b30bde1888e-ovnkube-config\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-run-netns\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-etc-selinux\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779683 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-tuned\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 
17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-etc-kubernetes\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779715 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779753 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-etc-selinux\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.779850 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.779765 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779810 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b7452b81-2d05-443f-9a0b-287e9bb664d2-konnectivity-ca\") pod \"konnectivity-agent-dqw8q\" (UID: \"b7452b81-2d05-443f-9a0b-287e9bb664d2\") " pod="kube-system/konnectivity-agent-dqw8q" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.779873 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs podName:341e9133-613e-45d4-bb0a-a187c93be340 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:58.279847383 +0000 UTC m=+2.025904160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs") pod "network-metrics-daemon-ht68l" (UID: "341e9133-613e-45d4-bb0a-a187c93be340") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-kubelet\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779926 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779952 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krhrc\" (UniqueName: \"kubernetes.io/projected/cf2999c2-b9c3-4067-b076-2b30bde1888e-kube-api-access-krhrc\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:51:57.779973 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-899hl\" (UniqueName: \"kubernetes.io/projected/d3454833-6f08-4cd5-9692-e10872c4ec39-kube-api-access-899hl\") pod \"node-ca-kspjc\" (UID: \"d3454833-6f08-4cd5-9692-e10872c4ec39\") " pod="openshift-image-registry/node-ca-kspjc" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.779998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780021 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-var-lib-openvswitch\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780053 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-cni-bin\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3454833-6f08-4cd5-9692-e10872c4ec39-host\") pod \"node-ca-kspjc\" (UID: \"d3454833-6f08-4cd5-9692-e10872c4ec39\") " pod="openshift-image-registry/node-ca-kspjc" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d3454833-6f08-4cd5-9692-e10872c4ec39-serviceca\") pod \"node-ca-kspjc\" (UID: \"d3454833-6f08-4cd5-9692-e10872c4ec39\") " pod="openshift-image-registry/node-ca-kspjc" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brdbn\" (UniqueName: \"kubernetes.io/projected/3170e59e-44e4-4d0e-bc55-b0dfc511392e-kube-api-access-brdbn\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-os-release\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-cni-netd\") pod \"ovnkube-node-f2vdv\" (UID: 
\"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.780548 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-sysconfig\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3170e59e-44e4-4d0e-bc55-b0dfc511392e-tmp\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780249 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-multus-conf-dir\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780277 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-run-multus-certs\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4c7d0c52-01d6-4b13-b631-cd9e35e59fa6-hosts-file\") pod 
\"node-resolver-r5k8g\" (UID: \"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6\") " pod="openshift-dns/node-resolver-r5k8g" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2f97\" (UniqueName: \"kubernetes.io/projected/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-kube-api-access-r2f97\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780328 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-run-systemd\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-etc-openvswitch\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4c7d0c52-01d6-4b13-b631-cd9e35e59fa6-hosts-file\") pod \"node-resolver-r5k8g\" (UID: \"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6\") " pod="openshift-dns/node-resolver-r5k8g" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780364 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-node-log\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780379 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-log-socket\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780413 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-host\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780469 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-lib-modules\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-cnibin\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-os-release\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2q7v\" (UniqueName: \"kubernetes.io/projected/341e9133-613e-45d4-bb0a-a187c93be340-kube-api-access-p2q7v\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-systemd-units\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.781295 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-cnibin\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780600 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-os-release\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780604 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjckm\" (UniqueName: \"kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm\") pod \"network-check-target-scd9x\" (UID: \"dd860804-99b0-4bb4-9784-21b0e42ce760\") " pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780719 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-systemd\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-system-cni-dir\") pod \"multus-hvczg\" (UID: 
\"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780751 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n947h\" (UniqueName: \"kubernetes.io/projected/c7adda2d-0bbf-4c4c-94e2-436f9b360093-kube-api-access-n947h\") pod \"iptables-alerter-dq722\" (UID: \"c7adda2d-0bbf-4c4c-94e2-436f9b360093\") " pod="openshift-network-operator/iptables-alerter-dq722" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5c58d588-382f-46d8-be38-9af05f699f8f-multus-daemon-config\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c7adda2d-0bbf-4c4c-94e2-436f9b360093-iptables-alerter-script\") pod \"iptables-alerter-dq722\" (UID: \"c7adda2d-0bbf-4c4c-94e2-436f9b360093\") " pod="openshift-network-operator/iptables-alerter-dq722" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780824 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b7452b81-2d05-443f-9a0b-287e9bb664d2-agent-certs\") pod \"konnectivity-agent-dqw8q\" (UID: \"b7452b81-2d05-443f-9a0b-287e9bb664d2\") " pod="kube-system/konnectivity-agent-dqw8q" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-device-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780866 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-run-ovn\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-var-lib-kubelet\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-device-dir\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7adda2d-0bbf-4c4c-94e2-436f9b360093-host-slash\") pod \"iptables-alerter-dq722\" (UID: \"c7adda2d-0bbf-4c4c-94e2-436f9b360093\") " pod="openshift-network-operator/iptables-alerter-dq722" Apr 17 07:51:57.781768 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780940 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf2999c2-b9c3-4067-b076-2b30bde1888e-ovnkube-script-lib\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-run\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.780969 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-hostroot\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-sys-fs\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf2999c2-b9c3-4067-b076-2b30bde1888e-env-overrides\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.782254 ip-10-0-137-8 
kubenswrapper[2573]: I0417 07:51:57.781019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf2999c2-b9c3-4067-b076-2b30bde1888e-ovn-node-metrics-cert\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-sys\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781074 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7641578b-50bd-469d-ab99-7bcdcbb1d6db-sys-fs\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-cnibin\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " 
pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-var-lib-cni-multus\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6mk6\" (UniqueName: \"kubernetes.io/projected/4c7d0c52-01d6-4b13-b631-cd9e35e59fa6-kube-api-access-l6mk6\") pod \"node-resolver-r5k8g\" (UID: \"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6\") " pod="openshift-dns/node-resolver-r5k8g"
Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781151 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-sysctl-d\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-var-lib-cni-bin\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.782254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.781191 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m"
Apr 17 07:51:57.795522 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.795498 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 07:51:57.798628 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.798607 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2f97\" (UniqueName: \"kubernetes.io/projected/b4eb62e2-ab98-4772-9149-6a8a3cd016b6-kube-api-access-r2f97\") pod \"multus-additional-cni-plugins-crv6m\" (UID: \"b4eb62e2-ab98-4772-9149-6a8a3cd016b6\") " pod="openshift-multus/multus-additional-cni-plugins-crv6m"
Apr 17 07:51:57.798730 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.798609 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2q7v\" (UniqueName: \"kubernetes.io/projected/341e9133-613e-45d4-bb0a-a187c93be340-kube-api-access-p2q7v\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l"
Apr 17 07:51:57.798730 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.798651 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6mk6\" (UniqueName: \"kubernetes.io/projected/4c7d0c52-01d6-4b13-b631-cd9e35e59fa6-kube-api-access-l6mk6\") pod \"node-resolver-r5k8g\" (UID: \"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6\") " pod="openshift-dns/node-resolver-r5k8g"
Apr 17 07:51:57.798730 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.798651 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5pxk\" (UniqueName: \"kubernetes.io/projected/7641578b-50bd-469d-ab99-7bcdcbb1d6db-kube-api-access-w5pxk\") pod \"aws-ebs-csi-driver-node-cn8xr\" (UID: \"7641578b-50bd-469d-ab99-7bcdcbb1d6db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr"
Apr 17 07:51:57.800027 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.799989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal" event={"ID":"f695ff27a724f87f07e7f9438b811560","Type":"ContainerStarted","Data":"5a24eb16cc61a7376df9e4000ac5667485f9e1b318751aeaaf73db93eb1d79ca"}
Apr 17 07:51:57.800816 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.800781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" event={"ID":"427071f03a4d74acae0367dcd87643a5","Type":"ContainerStarted","Data":"617461cef15da73f2a9b15efae68bf9c1ac15036df7ae92d5f9eddac734ce47a"}
Apr 17 07:51:57.882201 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-etc-kubernetes\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.882201 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b7452b81-2d05-443f-9a0b-287e9bb664d2-konnectivity-ca\") pod \"konnectivity-agent-dqw8q\" (UID: \"b7452b81-2d05-443f-9a0b-287e9bb664d2\") " pod="kube-system/konnectivity-agent-dqw8q"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-kubelet\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882234 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krhrc\" (UniqueName: \"kubernetes.io/projected/cf2999c2-b9c3-4067-b076-2b30bde1888e-kube-api-access-krhrc\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-899hl\" (UniqueName: \"kubernetes.io/projected/d3454833-6f08-4cd5-9692-e10872c4ec39-kube-api-access-899hl\") pod \"node-ca-kspjc\" (UID: \"d3454833-6f08-4cd5-9692-e10872c4ec39\") " pod="openshift-image-registry/node-ca-kspjc"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-var-lib-openvswitch\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882300 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-etc-kubernetes\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-cni-bin\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882320 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3454833-6f08-4cd5-9692-e10872c4ec39-host\") pod \"node-ca-kspjc\" (UID: \"d3454833-6f08-4cd5-9692-e10872c4ec39\") " pod="openshift-image-registry/node-ca-kspjc"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-var-lib-openvswitch\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882300 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-kubelet\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d3454833-6f08-4cd5-9692-e10872c4ec39-serviceca\") pod \"node-ca-kspjc\" (UID: \"d3454833-6f08-4cd5-9692-e10872c4ec39\") " pod="openshift-image-registry/node-ca-kspjc"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brdbn\" (UniqueName: \"kubernetes.io/projected/3170e59e-44e4-4d0e-bc55-b0dfc511392e-kube-api-access-brdbn\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3454833-6f08-4cd5-9692-e10872c4ec39-host\") pod \"node-ca-kspjc\" (UID: \"d3454833-6f08-4cd5-9692-e10872c4ec39\") " pod="openshift-image-registry/node-ca-kspjc"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882357 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-cni-bin\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.882409 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-os-release\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-cni-netd\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-sysconfig\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3170e59e-44e4-4d0e-bc55-b0dfc511392e-tmp\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-os-release\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-multus-conf-dir\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882508 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-cni-netd\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-run-multus-certs\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882543 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-sysconfig\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-run-systemd\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882581 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-etc-openvswitch\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-multus-conf-dir\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-run-systemd\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-run-multus-certs\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-node-log\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-etc-openvswitch\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-log-socket\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882665 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-log-socket\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-host\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-node-log\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-lib-modules\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-systemd-units\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-host\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjckm\" (UniqueName: \"kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm\") pod \"network-check-target-scd9x\" (UID: \"dd860804-99b0-4bb4-9784-21b0e42ce760\") " pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b7452b81-2d05-443f-9a0b-287e9bb664d2-konnectivity-ca\") pod \"konnectivity-agent-dqw8q\" (UID: \"b7452b81-2d05-443f-9a0b-287e9bb664d2\") " pod="kube-system/konnectivity-agent-dqw8q"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-systemd\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882831 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-systemd\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882779 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-systemd-units\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-system-cni-dir\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882902 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n947h\" (UniqueName: \"kubernetes.io/projected/c7adda2d-0bbf-4c4c-94e2-436f9b360093-kube-api-access-n947h\") pod \"iptables-alerter-dq722\" (UID: \"c7adda2d-0bbf-4c4c-94e2-436f9b360093\") " pod="openshift-network-operator/iptables-alerter-dq722"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-system-cni-dir\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-lib-modules\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5c58d588-382f-46d8-be38-9af05f699f8f-multus-daemon-config\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c7adda2d-0bbf-4c4c-94e2-436f9b360093-iptables-alerter-script\") pod \"iptables-alerter-dq722\" (UID: \"c7adda2d-0bbf-4c4c-94e2-436f9b360093\") " pod="openshift-network-operator/iptables-alerter-dq722"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.882994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b7452b81-2d05-443f-9a0b-287e9bb664d2-agent-certs\") pod \"konnectivity-agent-dqw8q\" (UID: \"b7452b81-2d05-443f-9a0b-287e9bb664d2\") " pod="kube-system/konnectivity-agent-dqw8q"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-run-ovn\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.883948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-var-lib-kubelet\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7adda2d-0bbf-4c4c-94e2-436f9b360093-host-slash\") pod \"iptables-alerter-dq722\" (UID: \"c7adda2d-0bbf-4c4c-94e2-436f9b360093\") " pod="openshift-network-operator/iptables-alerter-dq722"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf2999c2-b9c3-4067-b076-2b30bde1888e-ovnkube-script-lib\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883099 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-run-ovn\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-run\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883142 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7adda2d-0bbf-4c4c-94e2-436f9b360093-host-slash\") pod \"iptables-alerter-dq722\" (UID: \"c7adda2d-0bbf-4c4c-94e2-436f9b360093\") " pod="openshift-network-operator/iptables-alerter-dq722"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-hostroot\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-var-lib-kubelet\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883173 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-hostroot\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf2999c2-b9c3-4067-b076-2b30bde1888e-env-overrides\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf2999c2-b9c3-4067-b076-2b30bde1888e-ovn-node-metrics-cert\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-run\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-sys\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-cnibin\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-var-lib-cni-multus\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883321 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-sysctl-d\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-var-lib-cni-bin\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883368 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-slash\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.884680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-run-openvswitch\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-modprobe-d\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5c58d588-382f-46d8-be38-9af05f699f8f-cni-binary-copy\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-multus-socket-dir-parent\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883501 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-kubernetes\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883507 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d3454833-6f08-4cd5-9692-e10872c4ec39-serviceca\") pod \"node-ca-kspjc\" (UID: \"d3454833-6f08-4cd5-9692-e10872c4ec39\") " pod="openshift-image-registry/node-ca-kspjc"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-run-k8s-cni-cncf-io\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-run-netns\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-var-lib-cni-bin\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c7adda2d-0bbf-4c4c-94e2-436f9b360093-iptables-alerter-script\") pod \"iptables-alerter-dq722\" (UID: \"c7adda2d-0bbf-4c4c-94e2-436f9b360093\") " pod="openshift-network-operator/iptables-alerter-dq722"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-sysctl-conf\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-multus-cni-dir\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883658 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-var-lib-kubelet\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmklz\" (UniqueName: \"kubernetes.io/projected/5c58d588-382f-46d8-be38-9af05f699f8f-kube-api-access-cmklz\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883722 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-sysctl-conf\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf2999c2-b9c3-4067-b076-2b30bde1888e-ovnkube-config\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-run-netns\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.885175 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883762 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf2999c2-b9c3-4067-b076-2b30bde1888e-ovnkube-script-lib\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-slash\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-tuned\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd"
Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-run-k8s-cni-cncf-io\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg"
Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName:
\"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-cnibin\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883909 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-kubernetes\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883935 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-run-netns\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883945 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5c58d588-382f-46d8-be38-9af05f699f8f-multus-daemon-config\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-sys\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-multus-cni-dir\") pod 
\"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.883827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-run-openvswitch\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-var-lib-cni-multus\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf2999c2-b9c3-4067-b076-2b30bde1888e-env-overrides\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-sysctl-d\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-modprobe-d\") pod \"tuned-tqspd\" (UID: 
\"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884147 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-run-netns\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884156 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2999c2-b9c3-4067-b076-2b30bde1888e-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-multus-socket-dir-parent\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.885644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c58d588-382f-46d8-be38-9af05f699f8f-host-var-lib-kubelet\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.886128 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5c58d588-382f-46d8-be38-9af05f699f8f-cni-binary-copy\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") 
" pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.886128 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.884474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf2999c2-b9c3-4067-b076-2b30bde1888e-ovnkube-config\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.886128 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.885104 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3170e59e-44e4-4d0e-bc55-b0dfc511392e-tmp\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.886128 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.885580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b7452b81-2d05-443f-9a0b-287e9bb664d2-agent-certs\") pod \"konnectivity-agent-dqw8q\" (UID: \"b7452b81-2d05-443f-9a0b-287e9bb664d2\") " pod="kube-system/konnectivity-agent-dqw8q" Apr 17 07:51:57.886128 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.885883 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3170e59e-44e4-4d0e-bc55-b0dfc511392e-etc-tuned\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.886308 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.886181 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf2999c2-b9c3-4067-b076-2b30bde1888e-ovn-node-metrics-cert\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.888280 
ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.888256 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:57.888280 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.888277 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:57.888280 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.888286 2573 projected.go:194] Error preparing data for projected volume kube-api-access-vjckm for pod openshift-network-diagnostics/network-check-target-scd9x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:57.888442 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:57.888333 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm podName:dd860804-99b0-4bb4-9784-21b0e42ce760 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:58.388320733 +0000 UTC m=+2.134377504 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vjckm" (UniqueName: "kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm") pod "network-check-target-scd9x" (UID: "dd860804-99b0-4bb4-9784-21b0e42ce760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:57.890154 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.890128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n947h\" (UniqueName: \"kubernetes.io/projected/c7adda2d-0bbf-4c4c-94e2-436f9b360093-kube-api-access-n947h\") pod \"iptables-alerter-dq722\" (UID: \"c7adda2d-0bbf-4c4c-94e2-436f9b360093\") " pod="openshift-network-operator/iptables-alerter-dq722" Apr 17 07:51:57.890245 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.890232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brdbn\" (UniqueName: \"kubernetes.io/projected/3170e59e-44e4-4d0e-bc55-b0dfc511392e-kube-api-access-brdbn\") pod \"tuned-tqspd\" (UID: \"3170e59e-44e4-4d0e-bc55-b0dfc511392e\") " pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:57.890851 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.890833 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-899hl\" (UniqueName: \"kubernetes.io/projected/d3454833-6f08-4cd5-9692-e10872c4ec39-kube-api-access-899hl\") pod \"node-ca-kspjc\" (UID: \"d3454833-6f08-4cd5-9692-e10872c4ec39\") " pod="openshift-image-registry/node-ca-kspjc" Apr 17 07:51:57.891439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.891420 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmklz\" (UniqueName: \"kubernetes.io/projected/5c58d588-382f-46d8-be38-9af05f699f8f-kube-api-access-cmklz\") pod \"multus-hvczg\" (UID: \"5c58d588-382f-46d8-be38-9af05f699f8f\") " 
pod="openshift-multus/multus-hvczg" Apr 17 07:51:57.891508 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.891426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krhrc\" (UniqueName: \"kubernetes.io/projected/cf2999c2-b9c3-4067-b076-2b30bde1888e-kube-api-access-krhrc\") pod \"ovnkube-node-f2vdv\" (UID: \"cf2999c2-b9c3-4067-b076-2b30bde1888e\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:57.983493 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.983467 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" Apr 17 07:51:57.989114 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.989090 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r5k8g" Apr 17 07:51:57.989608 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:57.989493 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7641578b_50bd_469d_ab99_7bcdcbb1d6db.slice/crio-46ea45fcd1645d816816e2f126e9c898048c3877bb44713188abacd843751c43 WatchSource:0}: Error finding container 46ea45fcd1645d816816e2f126e9c898048c3877bb44713188abacd843751c43: Status 404 returned error can't find the container with id 46ea45fcd1645d816816e2f126e9c898048c3877bb44713188abacd843751c43 Apr 17 07:51:57.996085 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:57.996062 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c7d0c52_01d6_4b13_b631_cd9e35e59fa6.slice/crio-46e89ec3da2cb969fcdd40e7a5ffd9cc1cc520d0e3e08bce352582bdfdf36f56 WatchSource:0}: Error finding container 46e89ec3da2cb969fcdd40e7a5ffd9cc1cc520d0e3e08bce352582bdfdf36f56: Status 404 returned error can't find the container with id 46e89ec3da2cb969fcdd40e7a5ffd9cc1cc520d0e3e08bce352582bdfdf36f56 Apr 17 07:51:57.996193 
ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:57.996104 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dqw8q" Apr 17 07:51:58.000412 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.000346 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crv6m" Apr 17 07:51:58.003559 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:58.003531 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7452b81_2d05_443f_9a0b_287e9bb664d2.slice/crio-900f311e3f88dfbd87411d3811316b15a5c317bc698cf4c0f5f04d424a3932c0 WatchSource:0}: Error finding container 900f311e3f88dfbd87411d3811316b15a5c317bc698cf4c0f5f04d424a3932c0: Status 404 returned error can't find the container with id 900f311e3f88dfbd87411d3811316b15a5c317bc698cf4c0f5f04d424a3932c0 Apr 17 07:51:58.005311 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.005290 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:51:58.011143 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.011120 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tqspd" Apr 17 07:51:58.012085 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:58.012021 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4eb62e2_ab98_4772_9149_6a8a3cd016b6.slice/crio-3dd4aa37d0d603e4713d229ab3fffe81b96446322a8812d9150f92c40d492a7b WatchSource:0}: Error finding container 3dd4aa37d0d603e4713d229ab3fffe81b96446322a8812d9150f92c40d492a7b: Status 404 returned error can't find the container with id 3dd4aa37d0d603e4713d229ab3fffe81b96446322a8812d9150f92c40d492a7b Apr 17 07:51:58.014338 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:58.014313 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf2999c2_b9c3_4067_b076_2b30bde1888e.slice/crio-f43ecd3aa551a98a5363f57f7675c7bce981502575dc979627bdec43d28f2124 WatchSource:0}: Error finding container f43ecd3aa551a98a5363f57f7675c7bce981502575dc979627bdec43d28f2124: Status 404 returned error can't find the container with id f43ecd3aa551a98a5363f57f7675c7bce981502575dc979627bdec43d28f2124 Apr 17 07:51:58.015727 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.015705 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kspjc" Apr 17 07:51:58.019543 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:58.019522 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3170e59e_44e4_4d0e_bc55_b0dfc511392e.slice/crio-d1dc9e4c3de2fa1683a4a5b64164b6144e1a970d35912200dd178ed6d8713538 WatchSource:0}: Error finding container d1dc9e4c3de2fa1683a4a5b64164b6144e1a970d35912200dd178ed6d8713538: Status 404 returned error can't find the container with id d1dc9e4c3de2fa1683a4a5b64164b6144e1a970d35912200dd178ed6d8713538 Apr 17 07:51:58.020685 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.020449 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hvczg" Apr 17 07:51:58.025210 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.025187 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dq722" Apr 17 07:51:58.025486 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:58.025463 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3454833_6f08_4cd5_9692_e10872c4ec39.slice/crio-10bf0897461218861a91327778d5278889ac8ec43688009640e5ced7ff2f0a68 WatchSource:0}: Error finding container 10bf0897461218861a91327778d5278889ac8ec43688009640e5ced7ff2f0a68: Status 404 returned error can't find the container with id 10bf0897461218861a91327778d5278889ac8ec43688009640e5ced7ff2f0a68 Apr 17 07:51:58.028949 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:58.028928 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c58d588_382f_46d8_be38_9af05f699f8f.slice/crio-0f7919e7f44c99a36035c8db697c0a15dd2a36ba96073b2c3e7f284a2873563d WatchSource:0}: Error finding container 
0f7919e7f44c99a36035c8db697c0a15dd2a36ba96073b2c3e7f284a2873563d: Status 404 returned error can't find the container with id 0f7919e7f44c99a36035c8db697c0a15dd2a36ba96073b2c3e7f284a2873563d Apr 17 07:51:58.036496 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:51:58.036462 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7adda2d_0bbf_4c4c_94e2_436f9b360093.slice/crio-908df418e87c1556ffc18daefd2f342885f1cf8fda14cff525573cf70168db5b WatchSource:0}: Error finding container 908df418e87c1556ffc18daefd2f342885f1cf8fda14cff525573cf70168db5b: Status 404 returned error can't find the container with id 908df418e87c1556ffc18daefd2f342885f1cf8fda14cff525573cf70168db5b Apr 17 07:51:58.287663 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.287580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:51:58.287809 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:58.287752 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:58.287855 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:58.287817 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs podName:341e9133-613e-45d4-bb0a-a187c93be340 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:59.287800652 +0000 UTC m=+3.033857442 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs") pod "network-metrics-daemon-ht68l" (UID: "341e9133-613e-45d4-bb0a-a187c93be340") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:58.489193 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.488592 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjckm\" (UniqueName: \"kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm\") pod \"network-check-target-scd9x\" (UID: \"dd860804-99b0-4bb4-9784-21b0e42ce760\") " pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:51:58.489193 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:58.488737 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:58.489193 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:58.488755 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:58.489193 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:58.488768 2573 projected.go:194] Error preparing data for projected volume kube-api-access-vjckm for pod openshift-network-diagnostics/network-check-target-scd9x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:58.489193 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:58.488844 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm podName:dd860804-99b0-4bb4-9784-21b0e42ce760 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:59.488824101 +0000 UTC m=+3.234880882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vjckm" (UniqueName: "kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm") pod "network-check-target-scd9x" (UID: "dd860804-99b0-4bb4-9784-21b0e42ce760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:58.709673 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.709543 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:57 +0000 UTC" deadline="2027-09-19 19:46:24.918331932 +0000 UTC" Apr 17 07:51:58.709673 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.709580 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12491h54m26.208757118s" Apr 17 07:51:58.817968 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.817380 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dq722" event={"ID":"c7adda2d-0bbf-4c4c-94e2-436f9b360093","Type":"ContainerStarted","Data":"908df418e87c1556ffc18daefd2f342885f1cf8fda14cff525573cf70168db5b"} Apr 17 07:51:58.817968 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.817517 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:51:58.817968 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:58.817616 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:51:58.823049 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.822998 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kspjc" event={"ID":"d3454833-6f08-4cd5-9692-e10872c4ec39","Type":"ContainerStarted","Data":"10bf0897461218861a91327778d5278889ac8ec43688009640e5ced7ff2f0a68"} Apr 17 07:51:58.831681 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.831638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tqspd" event={"ID":"3170e59e-44e4-4d0e-bc55-b0dfc511392e","Type":"ContainerStarted","Data":"d1dc9e4c3de2fa1683a4a5b64164b6144e1a970d35912200dd178ed6d8713538"} Apr 17 07:51:58.844431 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.844379 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5k8g" event={"ID":"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6","Type":"ContainerStarted","Data":"46e89ec3da2cb969fcdd40e7a5ffd9cc1cc520d0e3e08bce352582bdfdf36f56"} Apr 17 07:51:58.848423 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.848186 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hvczg" event={"ID":"5c58d588-382f-46d8-be38-9af05f699f8f","Type":"ContainerStarted","Data":"0f7919e7f44c99a36035c8db697c0a15dd2a36ba96073b2c3e7f284a2873563d"} Apr 17 07:51:58.851039 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.850892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"f43ecd3aa551a98a5363f57f7675c7bce981502575dc979627bdec43d28f2124"} Apr 17 07:51:58.853633 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.853609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crv6m" 
event={"ID":"b4eb62e2-ab98-4772-9149-6a8a3cd016b6","Type":"ContainerStarted","Data":"3dd4aa37d0d603e4713d229ab3fffe81b96446322a8812d9150f92c40d492a7b"} Apr 17 07:51:58.860267 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.860214 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dqw8q" event={"ID":"b7452b81-2d05-443f-9a0b-287e9bb664d2","Type":"ContainerStarted","Data":"900f311e3f88dfbd87411d3811316b15a5c317bc698cf4c0f5f04d424a3932c0"} Apr 17 07:51:58.866846 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:58.866770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" event={"ID":"7641578b-50bd-469d-ab99-7bcdcbb1d6db","Type":"ContainerStarted","Data":"46ea45fcd1645d816816e2f126e9c898048c3877bb44713188abacd843751c43"} Apr 17 07:51:59.062981 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:59.062904 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:59.073653 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:59.073625 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:59.294538 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:59.294502 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:51:59.294715 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:59.294637 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:59.294715 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:59.294703 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs podName:341e9133-613e-45d4-bb0a-a187c93be340 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:01.294683929 +0000 UTC m=+5.040740729 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs") pod "network-metrics-daemon-ht68l" (UID: "341e9133-613e-45d4-bb0a-a187c93be340") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:59.495749 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:59.495662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjckm\" (UniqueName: \"kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm\") pod \"network-check-target-scd9x\" (UID: \"dd860804-99b0-4bb4-9784-21b0e42ce760\") " pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:51:59.495980 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:59.495878 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:59.495980 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:59.495939 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:59.495980 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:59.495966 2573 projected.go:194] Error preparing data for projected volume kube-api-access-vjckm for pod openshift-network-diagnostics/network-check-target-scd9x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:59.496158 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:59.496042 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm podName:dd860804-99b0-4bb4-9784-21b0e42ce760 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:01.496015779 +0000 UTC m=+5.242072552 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vjckm" (UniqueName: "kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm") pod "network-check-target-scd9x" (UID: "dd860804-99b0-4bb4-9784-21b0e42ce760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:59.709818 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:59.709723 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:57 +0000 UTC" deadline="2027-12-22 19:02:07.014892315 +0000 UTC" Apr 17 07:51:59.709818 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:59.709763 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14747h10m7.305133069s" Apr 17 07:51:59.799166 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:51:59.798386 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:51:59.799166 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:51:59.798513 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:00.798733 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:00.798236 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:00.798733 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:00.798372 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:01.314245 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:01.314139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:01.314437 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:01.314288 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:01.314437 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:01.314366 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs podName:341e9133-613e-45d4-bb0a-a187c93be340 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:05.314343227 +0000 UTC m=+9.060400007 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs") pod "network-metrics-daemon-ht68l" (UID: "341e9133-613e-45d4-bb0a-a187c93be340") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:01.516232 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:01.516195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjckm\" (UniqueName: \"kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm\") pod \"network-check-target-scd9x\" (UID: \"dd860804-99b0-4bb4-9784-21b0e42ce760\") " pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:01.516818 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:01.516406 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:01.516818 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:01.516437 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:01.516818 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:01.516450 2573 projected.go:194] Error preparing data for projected volume kube-api-access-vjckm for pod openshift-network-diagnostics/network-check-target-scd9x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:01.516818 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:01.516522 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm podName:dd860804-99b0-4bb4-9784-21b0e42ce760 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:05.516500793 +0000 UTC m=+9.262557590 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vjckm" (UniqueName: "kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm") pod "network-check-target-scd9x" (UID: "dd860804-99b0-4bb4-9784-21b0e42ce760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:01.798511 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:01.797951 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:01.798511 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:01.798095 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:02.798264 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:02.798232 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:02.798694 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:02.798369 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:03.797814 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:03.797616 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:03.797814 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:03.797760 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:04.798181 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:04.798147 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:04.798599 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:04.798280 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:05.346342 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:05.346303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:05.346522 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:05.346462 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:05.346522 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:05.346517 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs podName:341e9133-613e-45d4-bb0a-a187c93be340 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:13.346501867 +0000 UTC m=+17.092558639 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs") pod "network-metrics-daemon-ht68l" (UID: "341e9133-613e-45d4-bb0a-a187c93be340") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:05.547939 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:05.547903 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjckm\" (UniqueName: \"kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm\") pod \"network-check-target-scd9x\" (UID: \"dd860804-99b0-4bb4-9784-21b0e42ce760\") " pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:05.548123 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:05.548058 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:05.548123 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:05.548083 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:05.548123 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:05.548095 2573 projected.go:194] Error preparing data for projected volume kube-api-access-vjckm for pod openshift-network-diagnostics/network-check-target-scd9x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:05.548275 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:05.548143 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm podName:dd860804-99b0-4bb4-9784-21b0e42ce760 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:13.548129709 +0000 UTC m=+17.294186484 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vjckm" (UniqueName: "kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm") pod "network-check-target-scd9x" (UID: "dd860804-99b0-4bb4-9784-21b0e42ce760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:05.798044 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:05.797965 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:05.798205 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:05.798110 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:06.798706 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:06.798664 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:06.799176 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:06.798818 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:07.797799 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:07.797745 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:07.797969 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:07.797899 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:08.797703 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:08.797672 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:08.798106 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:08.797768 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:09.798597 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:09.798566 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:09.799067 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:09.798689 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:09.934328 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:09.934297 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fw888"] Apr 17 07:52:09.937173 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:09.937152 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:09.937307 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:09.937218 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f" Apr 17 07:52:09.982284 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:09.982241 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/03f66286-e29a-494b-a307-9a269b5cd89f-dbus\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:09.982478 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:09.982316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/03f66286-e29a-494b-a307-9a269b5cd89f-kubelet-config\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:09.982478 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:09.982339 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:10.083585 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:10.083486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/03f66286-e29a-494b-a307-9a269b5cd89f-kubelet-config\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:10.083585 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:10.083532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:10.083760 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:10.083589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/03f66286-e29a-494b-a307-9a269b5cd89f-dbus\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:10.083760 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:10.083635 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/03f66286-e29a-494b-a307-9a269b5cd89f-kubelet-config\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:10.083760 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:10.083702 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/03f66286-e29a-494b-a307-9a269b5cd89f-dbus\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:10.083760 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:10.083709 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:10.083975 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:10.083802 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret podName:03f66286-e29a-494b-a307-9a269b5cd89f nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:10.58376812 +0000 UTC m=+14.329824909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret") pod "global-pull-secret-syncer-fw888" (UID: "03f66286-e29a-494b-a307-9a269b5cd89f") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:10.587587 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:10.587547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:10.587909 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:10.587700 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:10.587909 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:10.587779 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret podName:03f66286-e29a-494b-a307-9a269b5cd89f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:11.587763081 +0000 UTC m=+15.333819865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret") pod "global-pull-secret-syncer-fw888" (UID: "03f66286-e29a-494b-a307-9a269b5cd89f") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:10.798303 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:10.798272 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:10.798475 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:10.798396 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:11.595688 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:11.595643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:11.596149 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:11.595779 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:11.596149 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:11.595860 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret podName:03f66286-e29a-494b-a307-9a269b5cd89f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:13.595838918 +0000 UTC m=+17.341895707 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret") pod "global-pull-secret-syncer-fw888" (UID: "03f66286-e29a-494b-a307-9a269b5cd89f") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:11.798069 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:11.798031 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:11.798221 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:11.798152 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:11.798276 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:11.798212 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:11.798347 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:11.798325 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f" Apr 17 07:52:12.800580 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:12.800549 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:12.800978 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:12.800655 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:13.409178 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:13.409138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:13.409428 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.409303 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:13.409428 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.409378 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs podName:341e9133-613e-45d4-bb0a-a187c93be340 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:29.409362154 +0000 UTC m=+33.155418926 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs") pod "network-metrics-daemon-ht68l" (UID: "341e9133-613e-45d4-bb0a-a187c93be340") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:13.611185 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:13.611151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjckm\" (UniqueName: \"kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm\") pod \"network-check-target-scd9x\" (UID: \"dd860804-99b0-4bb4-9784-21b0e42ce760\") " pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:13.611378 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:13.611328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:13.611378 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.611350 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:13.611378 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.611376 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:13.611533 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.611390 2573 projected.go:194] Error preparing data for projected volume kube-api-access-vjckm for pod openshift-network-diagnostics/network-check-target-scd9x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:13.611533 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.611415 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:13.611533 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.611459 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm podName:dd860804-99b0-4bb4-9784-21b0e42ce760 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:29.611438647 +0000 UTC m=+33.357495436 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vjckm" (UniqueName: "kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm") pod "network-check-target-scd9x" (UID: "dd860804-99b0-4bb4-9784-21b0e42ce760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:13.611533 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.611479 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret podName:03f66286-e29a-494b-a307-9a269b5cd89f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:17.611469311 +0000 UTC m=+21.357526089 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret") pod "global-pull-secret-syncer-fw888" (UID: "03f66286-e29a-494b-a307-9a269b5cd89f") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:13.797690 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:13.797602 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:13.797944 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:13.797602 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:13.797944 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.797759 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:13.797944 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:13.797849 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f" Apr 17 07:52:14.800326 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:14.800297 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:14.800774 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:14.800415 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:15.798404 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.798150 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:15.798522 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.798261 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:15.798522 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:15.798458 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f" Apr 17 07:52:15.798626 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:15.798555 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:15.902875 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.902833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal" event={"ID":"f695ff27a724f87f07e7f9438b811560","Type":"ContainerStarted","Data":"6377006db7dc7e5f478830f6516dd971d5a3d0065a9938ffcbaa901097c34c29"} Apr 17 07:52:15.904824 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.904749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hvczg" event={"ID":"5c58d588-382f-46d8-be38-9af05f699f8f","Type":"ContainerStarted","Data":"2e09f446b586ba60aaeef0e0d5fad30e2370f073ca2d51aa5793085f727586f1"} Apr 17 07:52:15.907964 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.907935 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"52fff7eb6258744a0ab187d5c1eedcc6ac8c398c97c11ed49a9326a5f58a5329"} Apr 17 07:52:15.908057 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.907972 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"11cd22483bcb08319635b214db9918c6af0586af078d103361c6f044bd7569eb"} Apr 17 07:52:15.908057 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.907988 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"336630a102a962bd02b3bb99a2a8014171464225d0fa5fdd93e7cf8c90f60d9e"} Apr 17 07:52:15.909643 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.909617 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tqspd" 
event={"ID":"3170e59e-44e4-4d0e-bc55-b0dfc511392e","Type":"ContainerStarted","Data":"88d33036bdad327b12ad2ff16a42311a74d25b4ac00812f18f88098badf408fc"} Apr 17 07:52:15.923504 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.923449 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-8.ec2.internal" podStartSLOduration=18.923436347 podStartE2EDuration="18.923436347s" podCreationTimestamp="2026-04-17 07:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:52:15.923058304 +0000 UTC m=+19.669115098" watchObservedRunningTime="2026-04-17 07:52:15.923436347 +0000 UTC m=+19.669493140" Apr 17 07:52:15.945171 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.944511 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tqspd" podStartSLOduration=2.572147298 podStartE2EDuration="19.944491061s" podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:51:58.021553151 +0000 UTC m=+1.767609939" lastFinishedPulling="2026-04-17 07:52:15.393896927 +0000 UTC m=+19.139953702" observedRunningTime="2026-04-17 07:52:15.943352985 +0000 UTC m=+19.689409781" watchObservedRunningTime="2026-04-17 07:52:15.944491061 +0000 UTC m=+19.690547856" Apr 17 07:52:15.967721 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:15.967678 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hvczg" podStartSLOduration=2.379085049 podStartE2EDuration="19.967663259s" podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:51:58.032080552 +0000 UTC m=+1.778137324" lastFinishedPulling="2026-04-17 07:52:15.620658757 +0000 UTC m=+19.366715534" observedRunningTime="2026-04-17 07:52:15.967043583 +0000 UTC m=+19.713100377" watchObservedRunningTime="2026-04-17 
07:52:15.967663259 +0000 UTC m=+19.713720053" Apr 17 07:52:16.798515 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.798484 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:16.798686 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:16.798574 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:16.912014 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.911978 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kspjc" event={"ID":"d3454833-6f08-4cd5-9692-e10872c4ec39","Type":"ContainerStarted","Data":"bff49dab577e14c8f9b0d0a4fc71aaedfa9430c0fd1686adb75fc416bef4e953"} Apr 17 07:52:16.913855 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.913819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5k8g" event={"ID":"4c7d0c52-01d6-4b13-b631-cd9e35e59fa6","Type":"ContainerStarted","Data":"44dba147b09e830c73af64b3b90185ba79d2370d8df59732e593faca7f53da06"} Apr 17 07:52:16.915421 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.915300 2573 generic.go:358] "Generic (PLEG): container finished" podID="427071f03a4d74acae0367dcd87643a5" containerID="ef3a6e7f477198065ce44736d6c25de8a0820b56edf5c035c58759900fe78b39" exitCode=0 Apr 17 07:52:16.915421 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.915392 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" 
event={"ID":"427071f03a4d74acae0367dcd87643a5","Type":"ContainerDied","Data":"ef3a6e7f477198065ce44736d6c25de8a0820b56edf5c035c58759900fe78b39"} Apr 17 07:52:16.918246 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.918224 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 07:52:16.918573 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.918552 2573 generic.go:358] "Generic (PLEG): container finished" podID="cf2999c2-b9c3-4067-b076-2b30bde1888e" containerID="11cd22483bcb08319635b214db9918c6af0586af078d103361c6f044bd7569eb" exitCode=1 Apr 17 07:52:16.918663 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.918617 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerDied","Data":"11cd22483bcb08319635b214db9918c6af0586af078d103361c6f044bd7569eb"} Apr 17 07:52:16.918663 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.918647 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"5ac6724ff3b8a8456ac5019dbd366e7ecc5801864f6762c4cb763924a2da278a"} Apr 17 07:52:16.918663 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.918660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"ce1cd03da55c9ca727954304fcab8d6f50e26ff3571d9b3a92532d7ce1c9dd8d"} Apr 17 07:52:16.918809 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.918671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" 
event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"fc36db50f4044501d4fdbb16e56a12ecac22a1a2b11e07fecc041da24c6a0782"} Apr 17 07:52:16.920228 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.920205 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4eb62e2-ab98-4772-9149-6a8a3cd016b6" containerID="06d976993c5fbfeb851267ed0da8a59009bdc3b01fdfbeade566e9a1c9000ce8" exitCode=0 Apr 17 07:52:16.920306 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.920251 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crv6m" event={"ID":"b4eb62e2-ab98-4772-9149-6a8a3cd016b6","Type":"ContainerDied","Data":"06d976993c5fbfeb851267ed0da8a59009bdc3b01fdfbeade566e9a1c9000ce8"} Apr 17 07:52:16.921841 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.921819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dqw8q" event={"ID":"b7452b81-2d05-443f-9a0b-287e9bb664d2","Type":"ContainerStarted","Data":"62a2a85566dabe750b3592a73fd501df98790d6d75b8fb60a4774f1e0cb6e0fc"} Apr 17 07:52:16.923227 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.923198 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" event={"ID":"7641578b-50bd-469d-ab99-7bcdcbb1d6db","Type":"ContainerStarted","Data":"4db121c158bdafc6f6d1ff59f8c640b3b29e98649085760bec72617fca250ffc"} Apr 17 07:52:16.924627 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.924605 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dq722" event={"ID":"c7adda2d-0bbf-4c4c-94e2-436f9b360093","Type":"ContainerStarted","Data":"70399ca957984481ee3522aa819d6a89b8d2b3fe48a97e72247fd9cc49481046"} Apr 17 07:52:16.928143 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.928108 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kspjc" 
podStartSLOduration=7.933200891 podStartE2EDuration="20.928097186s" podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:51:58.027401323 +0000 UTC m=+1.773458095" lastFinishedPulling="2026-04-17 07:52:11.022297617 +0000 UTC m=+14.768354390" observedRunningTime="2026-04-17 07:52:16.927430458 +0000 UTC m=+20.673487255" watchObservedRunningTime="2026-04-17 07:52:16.928097186 +0000 UTC m=+20.674153977" Apr 17 07:52:16.941128 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.940860 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dq722" podStartSLOduration=3.598666708 podStartE2EDuration="20.940843912s" podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:51:58.037940188 +0000 UTC m=+1.783996959" lastFinishedPulling="2026-04-17 07:52:15.380117383 +0000 UTC m=+19.126174163" observedRunningTime="2026-04-17 07:52:16.940094626 +0000 UTC m=+20.686151421" watchObservedRunningTime="2026-04-17 07:52:16.940843912 +0000 UTC m=+20.686900707" Apr 17 07:52:16.953140 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.953102 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dqw8q" podStartSLOduration=3.584254088 podStartE2EDuration="20.953090261s" podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:51:58.00598507 +0000 UTC m=+1.752041842" lastFinishedPulling="2026-04-17 07:52:15.37482123 +0000 UTC m=+19.120878015" observedRunningTime="2026-04-17 07:52:16.953083616 +0000 UTC m=+20.699140436" watchObservedRunningTime="2026-04-17 07:52:16.953090261 +0000 UTC m=+20.699147055" Apr 17 07:52:16.981066 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:16.981005 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r5k8g" podStartSLOduration=3.578707988 podStartE2EDuration="20.980988536s" 
podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:51:57.997890115 +0000 UTC m=+1.743946902" lastFinishedPulling="2026-04-17 07:52:15.400170664 +0000 UTC m=+19.146227450" observedRunningTime="2026-04-17 07:52:16.980612665 +0000 UTC m=+20.726669460" watchObservedRunningTime="2026-04-17 07:52:16.980988536 +0000 UTC m=+20.727045330" Apr 17 07:52:17.303288 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.303232 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 07:52:17.643054 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.643015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:17.643283 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:17.643228 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:17.643373 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:17.643297 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret podName:03f66286-e29a-494b-a307-9a269b5cd89f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:25.643280026 +0000 UTC m=+29.389336799 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret") pod "global-pull-secret-syncer-fw888" (UID: "03f66286-e29a-494b-a307-9a269b5cd89f") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:17.754818 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.754691 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:52:17.303253322Z","UUID":"cb499d26-22ca-4dce-8125-e686612fbee7","Handler":null,"Name":"","Endpoint":""} Apr 17 07:52:17.757356 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.757332 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 07:52:17.757471 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.757366 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 07:52:17.798596 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.798570 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:17.798733 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.798570 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:17.798733 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:17.798700 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f" Apr 17 07:52:17.798848 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:17.798781 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:17.929567 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.929314 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" event={"ID":"427071f03a4d74acae0367dcd87643a5","Type":"ContainerStarted","Data":"4e0b6286d019bdd32761c121735e9727bdc531a369052a09db73bfb53d28a37e"} Apr 17 07:52:17.931320 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.931290 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" event={"ID":"7641578b-50bd-469d-ab99-7bcdcbb1d6db","Type":"ContainerStarted","Data":"a4a1580bca7fe7d3606d20a0073f44dd35c4c08a4bfabc27f9d514993fb9e82d"} Apr 17 07:52:17.944474 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:17.944401 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-8.ec2.internal" podStartSLOduration=20.944384583 podStartE2EDuration="20.944384583s" podCreationTimestamp="2026-04-17 07:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:52:17.943737716 +0000 UTC m=+21.689794514" watchObservedRunningTime="2026-04-17 07:52:17.944384583 +0000 UTC m=+21.690441378" Apr 17 07:52:18.797764 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:52:18.797730 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:18.797948 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:18.797874 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:18.936515 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:18.936481 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 07:52:18.937090 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:18.936885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"ff7ed4f9938a5cd15bbfe81c14992ce0c36e24cdbf10bde88199ce4af9369f7a"} Apr 17 07:52:18.938843 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:18.938814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" event={"ID":"7641578b-50bd-469d-ab99-7bcdcbb1d6db","Type":"ContainerStarted","Data":"f5a69a19a3e1f3f72cee464d47e57df9449dd63ff5ea5453d00f63794bf39009"} Apr 17 07:52:18.956599 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:18.956553 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cn8xr" podStartSLOduration=2.933994405 podStartE2EDuration="22.956539176s" podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:51:57.991894704 +0000 UTC m=+1.737951475" 
lastFinishedPulling="2026-04-17 07:52:18.014439474 +0000 UTC m=+21.760496246" observedRunningTime="2026-04-17 07:52:18.954865061 +0000 UTC m=+22.700921856" watchObservedRunningTime="2026-04-17 07:52:18.956539176 +0000 UTC m=+22.702596025" Apr 17 07:52:19.798514 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:19.798485 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:19.798690 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:19.798492 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:19.798690 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:19.798588 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f" Apr 17 07:52:19.798775 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:19.798688 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:20.797762 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:20.797741 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x" Apr 17 07:52:20.798145 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:20.797845 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760" Apr 17 07:52:20.945123 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:20.945098 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 07:52:20.945467 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:20.945431 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"daf58d6139114562ae324e5f325702f5f76731bdabb06f3897e4e7c48335bc79"} Apr 17 07:52:20.945742 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:20.945719 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:52:20.945855 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:20.945750 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:52:20.945946 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:20.945931 2573 scope.go:117] "RemoveContainer" containerID="11cd22483bcb08319635b214db9918c6af0586af078d103361c6f044bd7569eb" Apr 17 07:52:20.960314 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:20.960292 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:52:21.443767 
ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.443607 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dqw8q" Apr 17 07:52:21.444256 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.444239 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dqw8q" Apr 17 07:52:21.798395 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.798369 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:21.798763 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.798376 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:21.798763 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:21.798489 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f" Apr 17 07:52:21.798763 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:21.798527 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:21.950232 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.950206 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 07:52:21.950539 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.950511 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" event={"ID":"cf2999c2-b9c3-4067-b076-2b30bde1888e","Type":"ContainerStarted","Data":"a5c183c3f4538a46465ae7cccbf031de50b30d8ab1cd0105bc308eef1a9a61ce"} Apr 17 07:52:21.950728 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.950709 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:52:21.952881 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.952838 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4eb62e2-ab98-4772-9149-6a8a3cd016b6" containerID="8a245ddb74903b7ff58196e2b0c508f6dc04d1e3fa0df7bad5bc8c95a1fbe050" exitCode=0 Apr 17 07:52:21.953009 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.952941 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crv6m" event={"ID":"b4eb62e2-ab98-4772-9149-6a8a3cd016b6","Type":"ContainerDied","Data":"8a245ddb74903b7ff58196e2b0c508f6dc04d1e3fa0df7bad5bc8c95a1fbe050"} Apr 17 07:52:21.966325 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.966306 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" Apr 17 07:52:21.978877 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:21.978844 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv" podStartSLOduration=8.321910053 podStartE2EDuration="25.978833234s" 
podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:51:58.016519042 +0000 UTC m=+1.762575815" lastFinishedPulling="2026-04-17 07:52:15.673442204 +0000 UTC m=+19.419498996" observedRunningTime="2026-04-17 07:52:21.978599546 +0000 UTC m=+25.724656339" watchObservedRunningTime="2026-04-17 07:52:21.978833234 +0000 UTC m=+25.724890027" Apr 17 07:52:22.776586 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:22.776552 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ht68l"] Apr 17 07:52:22.776779 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:22.776692 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:52:22.776877 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:22.776826 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340" Apr 17 07:52:22.779610 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:22.779585 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fw888"] Apr 17 07:52:22.779728 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:22.779673 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888" Apr 17 07:52:22.779778 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:22.779753 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f"
Apr 17 07:52:22.780410 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:22.780377 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-scd9x"]
Apr 17 07:52:22.780525 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:22.780469 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:52:22.780585 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:22.780546 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760"
Apr 17 07:52:23.181596 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:23.181566 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dqw8q"
Apr 17 07:52:23.182260 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:23.181685 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 07:52:23.182260 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:23.182207 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dqw8q"
Apr 17 07:52:23.962439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:23.962408 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4eb62e2-ab98-4772-9149-6a8a3cd016b6" containerID="53ffb9a55d430809fc1c92f5c6e11e877d7be986a92fe6d02d77c9a3d45eae36" exitCode=0
Apr 17 07:52:23.962624 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:23.962492 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crv6m" event={"ID":"b4eb62e2-ab98-4772-9149-6a8a3cd016b6","Type":"ContainerDied","Data":"53ffb9a55d430809fc1c92f5c6e11e877d7be986a92fe6d02d77c9a3d45eae36"}
Apr 17 07:52:24.797731 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:24.797695 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l"
Apr 17 07:52:24.798190 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:24.797695 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:52:24.798190 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:24.797845 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340"
Apr 17 07:52:24.798190 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:24.797922 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760"
Apr 17 07:52:24.798190 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:24.797695 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888"
Apr 17 07:52:24.798190 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:24.798015 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f"
Apr 17 07:52:25.700863 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:25.700627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888"
Apr 17 07:52:25.701004 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:25.700775 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:25.701004 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:25.700935 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret podName:03f66286-e29a-494b-a307-9a269b5cd89f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:41.700916239 +0000 UTC m=+45.446973011 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret") pod "global-pull-secret-syncer-fw888" (UID: "03f66286-e29a-494b-a307-9a269b5cd89f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:25.971174 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:25.971089 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4eb62e2-ab98-4772-9149-6a8a3cd016b6" containerID="3a9c1a5d6eeea2a2cd8907ac784c35175aea635120ca776170a725f8e026c715" exitCode=0
Apr 17 07:52:25.971174 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:25.971134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crv6m" event={"ID":"b4eb62e2-ab98-4772-9149-6a8a3cd016b6","Type":"ContainerDied","Data":"3a9c1a5d6eeea2a2cd8907ac784c35175aea635120ca776170a725f8e026c715"}
Apr 17 07:52:26.799486 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:26.799450 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l"
Apr 17 07:52:26.799681 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:26.799669 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888"
Apr 17 07:52:26.799738 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:26.799705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:52:26.799816 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:26.799721 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340"
Apr 17 07:52:26.799816 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:26.799775 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-scd9x" podUID="dd860804-99b0-4bb4-9784-21b0e42ce760"
Apr 17 07:52:26.799918 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:26.799863 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fw888" podUID="03f66286-e29a-494b-a307-9a269b5cd89f"
Apr 17 07:52:27.570260 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.570169 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-8.ec2.internal" event="NodeReady"
Apr 17 07:52:27.570681 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.570311 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 07:52:27.611681 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.611640 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jvx7c"]
Apr 17 07:52:27.635527 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.635498 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cj7gw"]
Apr 17 07:52:27.635702 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.635680 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.638307 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.638285 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 07:52:27.638444 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.638343 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xs2k6\""
Apr 17 07:52:27.638444 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.638344 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 07:52:27.651331 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.651304 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jvx7c"]
Apr 17 07:52:27.651463 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.651338 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cj7gw"]
Apr 17 07:52:27.651463 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.651439 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:27.653809 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.653775 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 07:52:27.653928 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.653870 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wmmwx\""
Apr 17 07:52:27.653928 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.653872 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 07:52:27.654038 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.653872 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 07:52:27.715449 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.715412 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46t7\" (UniqueName: \"kubernetes.io/projected/a31b36b1-77de-4517-8c23-566021eb1d32-kube-api-access-g46t7\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.715596 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.715458 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:27.715596 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.715494 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a31b36b1-77de-4517-8c23-566021eb1d32-tmp-dir\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.715596 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.715515 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8dfp\" (UniqueName: \"kubernetes.io/projected/faa9121c-e579-414f-8d7d-77beba5608ea-kube-api-access-z8dfp\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:27.715596 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.715570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a31b36b1-77de-4517-8c23-566021eb1d32-config-volume\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.715596 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.715591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.816534 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.816495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a31b36b1-77de-4517-8c23-566021eb1d32-tmp-dir\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.816534 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.816533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8dfp\" (UniqueName: \"kubernetes.io/projected/faa9121c-e579-414f-8d7d-77beba5608ea-kube-api-access-z8dfp\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:27.816756 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.816606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a31b36b1-77de-4517-8c23-566021eb1d32-config-volume\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.816756 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.816623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.816756 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.816659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g46t7\" (UniqueName: \"kubernetes.io/projected/a31b36b1-77de-4517-8c23-566021eb1d32-kube-api-access-g46t7\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.816756 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.816689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:27.816981 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:27.816773 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:27.816981 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:27.816781 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:27.816981 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:27.816866 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls podName:a31b36b1-77de-4517-8c23-566021eb1d32 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:28.316844219 +0000 UTC m=+32.062900997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls") pod "dns-default-jvx7c" (UID: "a31b36b1-77de-4517-8c23-566021eb1d32") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:27.816981 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.816894 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a31b36b1-77de-4517-8c23-566021eb1d32-tmp-dir\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.816981 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:27.816910 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert podName:faa9121c-e579-414f-8d7d-77beba5608ea nodeName:}" failed. No retries permitted until 2026-04-17 07:52:28.316892821 +0000 UTC m=+32.062949605 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert") pod "ingress-canary-cj7gw" (UID: "faa9121c-e579-414f-8d7d-77beba5608ea") : secret "canary-serving-cert" not found
Apr 17 07:52:27.817269 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.817091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a31b36b1-77de-4517-8c23-566021eb1d32-config-volume\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.828287 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.828214 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46t7\" (UniqueName: \"kubernetes.io/projected/a31b36b1-77de-4517-8c23-566021eb1d32-kube-api-access-g46t7\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:27.828430 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:27.828316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8dfp\" (UniqueName: \"kubernetes.io/projected/faa9121c-e579-414f-8d7d-77beba5608ea-kube-api-access-z8dfp\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:28.320875 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.320830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:28.321092 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.320921 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:28.321092 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:28.320999 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:28.321092 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:28.321024 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:28.321092 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:28.321079 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert podName:faa9121c-e579-414f-8d7d-77beba5608ea nodeName:}" failed. No retries permitted until 2026-04-17 07:52:29.321062477 +0000 UTC m=+33.067119263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert") pod "ingress-canary-cj7gw" (UID: "faa9121c-e579-414f-8d7d-77beba5608ea") : secret "canary-serving-cert" not found
Apr 17 07:52:28.321300 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:28.321097 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls podName:a31b36b1-77de-4517-8c23-566021eb1d32 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:29.321087537 +0000 UTC m=+33.067144309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls") pod "dns-default-jvx7c" (UID: "a31b36b1-77de-4517-8c23-566021eb1d32") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:28.797693 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.797658 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l"
Apr 17 07:52:28.798233 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.797658 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888"
Apr 17 07:52:28.798233 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.797658 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:52:28.800661 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.800634 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:52:28.801810 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.801776 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:52:28.802226 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.802102 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 07:52:28.802226 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.802117 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:52:28.802226 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.802147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z4m56\""
Apr 17 07:52:28.802226 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:28.802168 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ctqdx\""
Apr 17 07:52:29.329442 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:29.329402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:29.329641 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:29.329469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:29.329641 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:29.329556 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:29.329641 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:29.329628 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:29.329784 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:29.329636 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls podName:a31b36b1-77de-4517-8c23-566021eb1d32 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:31.3296178 +0000 UTC m=+35.075674577 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls") pod "dns-default-jvx7c" (UID: "a31b36b1-77de-4517-8c23-566021eb1d32") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:29.329784 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:29.329696 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert podName:faa9121c-e579-414f-8d7d-77beba5608ea nodeName:}" failed. No retries permitted until 2026-04-17 07:52:31.329677031 +0000 UTC m=+35.075733805 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert") pod "ingress-canary-cj7gw" (UID: "faa9121c-e579-414f-8d7d-77beba5608ea") : secret "canary-serving-cert" not found
Apr 17 07:52:29.430849 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:29.430806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l"
Apr 17 07:52:29.431027 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:29.430925 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:52:29.431027 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:29.431004 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs podName:341e9133-613e-45d4-bb0a-a187c93be340 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:01.430986478 +0000 UTC m=+65.177043250 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs") pod "network-metrics-daemon-ht68l" (UID: "341e9133-613e-45d4-bb0a-a187c93be340") : secret "metrics-daemon-secret" not found
Apr 17 07:52:29.631912 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:29.631866 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjckm\" (UniqueName: \"kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm\") pod \"network-check-target-scd9x\" (UID: \"dd860804-99b0-4bb4-9784-21b0e42ce760\") " pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:52:29.635106 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:29.635076 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjckm\" (UniqueName: \"kubernetes.io/projected/dd860804-99b0-4bb4-9784-21b0e42ce760-kube-api-access-vjckm\") pod \"network-check-target-scd9x\" (UID: \"dd860804-99b0-4bb4-9784-21b0e42ce760\") " pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:52:29.737156 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:29.737113 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:52:31.345654 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:31.345612 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:31.346148 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:31.345698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:31.346148 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:31.345819 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:31.346148 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:31.345830 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:31.346148 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:31.345892 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert podName:faa9121c-e579-414f-8d7d-77beba5608ea nodeName:}" failed. No retries permitted until 2026-04-17 07:52:35.345871297 +0000 UTC m=+39.091928075 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert") pod "ingress-canary-cj7gw" (UID: "faa9121c-e579-414f-8d7d-77beba5608ea") : secret "canary-serving-cert" not found
Apr 17 07:52:31.346148 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:31.345911 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls podName:a31b36b1-77de-4517-8c23-566021eb1d32 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:35.345902849 +0000 UTC m=+39.091959620 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls") pod "dns-default-jvx7c" (UID: "a31b36b1-77de-4517-8c23-566021eb1d32") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:31.632684 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:31.632657 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-scd9x"]
Apr 17 07:52:31.637024 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:52:31.636994 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd860804_99b0_4bb4_9784_21b0e42ce760.slice/crio-708a8dd6ef5c12afee0342fdd3d35901224fe258d4d45aa8dabb1bbf252c0b3d WatchSource:0}: Error finding container 708a8dd6ef5c12afee0342fdd3d35901224fe258d4d45aa8dabb1bbf252c0b3d: Status 404 returned error can't find the container with id 708a8dd6ef5c12afee0342fdd3d35901224fe258d4d45aa8dabb1bbf252c0b3d
Apr 17 07:52:31.983473 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:31.983444 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-scd9x" event={"ID":"dd860804-99b0-4bb4-9784-21b0e42ce760","Type":"ContainerStarted","Data":"708a8dd6ef5c12afee0342fdd3d35901224fe258d4d45aa8dabb1bbf252c0b3d"}
Apr 17 07:52:32.988269 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:32.988237 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4eb62e2-ab98-4772-9149-6a8a3cd016b6" containerID="bb01397e3c1fa722a16bf25176c569328cf317ecfb4ce2af055234a1c3bacd53" exitCode=0
Apr 17 07:52:32.988991 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:32.988287 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crv6m" event={"ID":"b4eb62e2-ab98-4772-9149-6a8a3cd016b6","Type":"ContainerDied","Data":"bb01397e3c1fa722a16bf25176c569328cf317ecfb4ce2af055234a1c3bacd53"}
Apr 17 07:52:33.993602 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:33.993566 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4eb62e2-ab98-4772-9149-6a8a3cd016b6" containerID="3c4aaaf3d3c93a8fd214bad3cd6acf3fa5d5042c55c0a60a19274359ec727192" exitCode=0
Apr 17 07:52:33.994156 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:33.993633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crv6m" event={"ID":"b4eb62e2-ab98-4772-9149-6a8a3cd016b6","Type":"ContainerDied","Data":"3c4aaaf3d3c93a8fd214bad3cd6acf3fa5d5042c55c0a60a19274359ec727192"}
Apr 17 07:52:34.998941 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:34.998859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crv6m" event={"ID":"b4eb62e2-ab98-4772-9149-6a8a3cd016b6","Type":"ContainerStarted","Data":"cb9135555e43045dcd0cfb33173d58c1d2bb784dd8c132de53b2b8a9bb5832b3"}
Apr 17 07:52:35.000183 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:35.000154 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-scd9x" event={"ID":"dd860804-99b0-4bb4-9784-21b0e42ce760","Type":"ContainerStarted","Data":"76ace822bffc501f9b6c19c15ff72317018e814e853d44f53db4dabf28197119"}
Apr 17 07:52:35.000310 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:35.000293 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:52:35.020959 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:35.020913 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-crv6m" podStartSLOduration=5.182460963 podStartE2EDuration="39.020900417s" podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:51:58.013907107 +0000 UTC m=+1.759963881" lastFinishedPulling="2026-04-17 07:52:31.852346549 +0000 UTC m=+35.598403335" observedRunningTime="2026-04-17 07:52:35.019632019 +0000 UTC m=+38.765688814" watchObservedRunningTime="2026-04-17 07:52:35.020900417 +0000 UTC m=+38.766957211"
Apr 17 07:52:35.037480 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:35.037442 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-scd9x" podStartSLOduration=34.932681803 podStartE2EDuration="38.037430331s" podCreationTimestamp="2026-04-17 07:51:57 +0000 UTC" firstStartedPulling="2026-04-17 07:52:31.638650394 +0000 UTC m=+35.384707168" lastFinishedPulling="2026-04-17 07:52:34.743398907 +0000 UTC m=+38.489455696" observedRunningTime="2026-04-17 07:52:35.037085064 +0000 UTC m=+38.783141853" watchObservedRunningTime="2026-04-17 07:52:35.037430331 +0000 UTC m=+38.783487154"
Apr 17 07:52:35.383170 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:35.383140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:35.383320 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:35.383214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:35.383320 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:35.383295 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:35.383320 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:35.383297 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:35.383417 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:35.383349 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls podName:a31b36b1-77de-4517-8c23-566021eb1d32 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:43.383336069 +0000 UTC m=+47.129392841 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls") pod "dns-default-jvx7c" (UID: "a31b36b1-77de-4517-8c23-566021eb1d32") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:35.383417 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:35.383361 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert podName:faa9121c-e579-414f-8d7d-77beba5608ea nodeName:}" failed. No retries permitted until 2026-04-17 07:52:43.383355487 +0000 UTC m=+47.129412258 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert") pod "ingress-canary-cj7gw" (UID: "faa9121c-e579-414f-8d7d-77beba5608ea") : secret "canary-serving-cert" not found
Apr 17 07:52:41.729375 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:41.729329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888"
Apr 17 07:52:41.732517 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:41.732495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/03f66286-e29a-494b-a307-9a269b5cd89f-original-pull-secret\") pod \"global-pull-secret-syncer-fw888\" (UID: \"03f66286-e29a-494b-a307-9a269b5cd89f\") " pod="kube-system/global-pull-secret-syncer-fw888"
Apr 17 07:52:42.019433 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:42.019351 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fw888"
Apr 17 07:52:42.131276 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:42.131247 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fw888"]
Apr 17 07:52:42.135077 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:52:42.135050 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03f66286_e29a_494b_a307_9a269b5cd89f.slice/crio-11dfaa5186ef43f4f89addb3573d1b85a840b81e275ac85e8700cdba6742e8c6 WatchSource:0}: Error finding container 11dfaa5186ef43f4f89addb3573d1b85a840b81e275ac85e8700cdba6742e8c6: Status 404 returned error can't find the container with id 11dfaa5186ef43f4f89addb3573d1b85a840b81e275ac85e8700cdba6742e8c6
Apr 17 07:52:43.015742 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:43.015691 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fw888" event={"ID":"03f66286-e29a-494b-a307-9a269b5cd89f","Type":"ContainerStarted","Data":"11dfaa5186ef43f4f89addb3573d1b85a840b81e275ac85e8700cdba6742e8c6"}
Apr 17 07:52:43.444535 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:43.444491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:43.444691 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:43.444631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:43.444691 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:43.444634 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:43.444889 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:43.444710 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert podName:faa9121c-e579-414f-8d7d-77beba5608ea nodeName:}" failed. No retries permitted until 2026-04-17 07:52:59.444688184 +0000 UTC m=+63.190744963 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert") pod "ingress-canary-cj7gw" (UID: "faa9121c-e579-414f-8d7d-77beba5608ea") : secret "canary-serving-cert" not found
Apr 17 07:52:43.444889 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:43.444728 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:43.444889 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:43.444775 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls podName:a31b36b1-77de-4517-8c23-566021eb1d32 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:59.444760913 +0000 UTC m=+63.190817686 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls") pod "dns-default-jvx7c" (UID: "a31b36b1-77de-4517-8c23-566021eb1d32") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:45.001732 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.001697 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"]
Apr 17 07:52:45.005620 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.005596 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.008442 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.008387 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 07:52:45.008442 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.008401 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 07:52:45.008628 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.008390 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 07:52:45.009265 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.009242 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 07:52:45.017967 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.017939 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"]
Apr 17 07:52:45.054529 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.054502 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbgz2\" (UniqueName: \"kubernetes.io/projected/ffcc9a42-5724-4285-b6fa-f36cc6335ef6-kube-api-access-cbgz2\") pod \"klusterlet-addon-workmgr-587777859c-d5lsm\" (UID: \"ffcc9a42-5724-4285-b6fa-f36cc6335ef6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.054695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.054566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffcc9a42-5724-4285-b6fa-f36cc6335ef6-tmp\") pod \"klusterlet-addon-workmgr-587777859c-d5lsm\" (UID: \"ffcc9a42-5724-4285-b6fa-f36cc6335ef6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.054763 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.054687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ffcc9a42-5724-4285-b6fa-f36cc6335ef6-klusterlet-config\") pod \"klusterlet-addon-workmgr-587777859c-d5lsm\" (UID: \"ffcc9a42-5724-4285-b6fa-f36cc6335ef6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.155341 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.155299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ffcc9a42-5724-4285-b6fa-f36cc6335ef6-klusterlet-config\") pod \"klusterlet-addon-workmgr-587777859c-d5lsm\" (UID: \"ffcc9a42-5724-4285-b6fa-f36cc6335ef6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.155540 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.155373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbgz2\" (UniqueName: \"kubernetes.io/projected/ffcc9a42-5724-4285-b6fa-f36cc6335ef6-kube-api-access-cbgz2\") pod \"klusterlet-addon-workmgr-587777859c-d5lsm\" (UID: \"ffcc9a42-5724-4285-b6fa-f36cc6335ef6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.155540 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.155421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffcc9a42-5724-4285-b6fa-f36cc6335ef6-tmp\") pod \"klusterlet-addon-workmgr-587777859c-d5lsm\" (UID: \"ffcc9a42-5724-4285-b6fa-f36cc6335ef6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.155945 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.155913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffcc9a42-5724-4285-b6fa-f36cc6335ef6-tmp\") pod \"klusterlet-addon-workmgr-587777859c-d5lsm\" (UID: \"ffcc9a42-5724-4285-b6fa-f36cc6335ef6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.158481 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.158455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ffcc9a42-5724-4285-b6fa-f36cc6335ef6-klusterlet-config\") pod \"klusterlet-addon-workmgr-587777859c-d5lsm\" (UID: \"ffcc9a42-5724-4285-b6fa-f36cc6335ef6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.163861 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.163841 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbgz2\" (UniqueName: \"kubernetes.io/projected/ffcc9a42-5724-4285-b6fa-f36cc6335ef6-kube-api-access-cbgz2\") pod \"klusterlet-addon-workmgr-587777859c-d5lsm\" (UID: \"ffcc9a42-5724-4285-b6fa-f36cc6335ef6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.318165 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.318074 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:45.800935 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:45.800902 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"]
Apr 17 07:52:45.805169 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:52:45.805135 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffcc9a42_5724_4285_b6fa_f36cc6335ef6.slice/crio-814d48a044aa0c3f2caa87ca8d904b3b5393a8e301ba376482aafeacd12bbca7 WatchSource:0}: Error finding container 814d48a044aa0c3f2caa87ca8d904b3b5393a8e301ba376482aafeacd12bbca7: Status 404 returned error can't find the container with id 814d48a044aa0c3f2caa87ca8d904b3b5393a8e301ba376482aafeacd12bbca7
Apr 17 07:52:46.024672 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:46.024591 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm" event={"ID":"ffcc9a42-5724-4285-b6fa-f36cc6335ef6","Type":"ContainerStarted","Data":"814d48a044aa0c3f2caa87ca8d904b3b5393a8e301ba376482aafeacd12bbca7"}
Apr 17 07:52:46.025805 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:46.025764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fw888" event={"ID":"03f66286-e29a-494b-a307-9a269b5cd89f","Type":"ContainerStarted","Data":"5c51b997ea93e3807e0fcde18020416035cd40fae1c07ee050ff36d2ad9a62b3"}
Apr 17 07:52:46.040689 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:46.040632 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fw888" podStartSLOduration=33.427527503 podStartE2EDuration="37.040617425s" podCreationTimestamp="2026-04-17 07:52:09 +0000 UTC" firstStartedPulling="2026-04-17 07:52:42.136671788 +0000 UTC m=+45.882728560" lastFinishedPulling="2026-04-17 07:52:45.749761695 +0000 UTC m=+49.495818482" observedRunningTime="2026-04-17 07:52:46.040535995 +0000 UTC m=+49.786592789" watchObservedRunningTime="2026-04-17 07:52:46.040617425 +0000 UTC m=+49.786674219"
Apr 17 07:52:50.033865 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:50.033826 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm" event={"ID":"ffcc9a42-5724-4285-b6fa-f36cc6335ef6","Type":"ContainerStarted","Data":"6b2fc11a10823d982a289bed7492dadae730f82eb7c44a14d083fd1797548acb"}
Apr 17 07:52:50.034283 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:50.034071 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:50.035589 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:50.035566 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm"
Apr 17 07:52:50.080258 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:50.080206 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-587777859c-d5lsm" podStartSLOduration=2.855846107 podStartE2EDuration="6.08018898s" podCreationTimestamp="2026-04-17 07:52:44 +0000 UTC" firstStartedPulling="2026-04-17 07:52:45.807756798 +0000 UTC m=+49.553813576" lastFinishedPulling="2026-04-17 07:52:49.032099667 +0000 UTC m=+52.778156449" observedRunningTime="2026-04-17 07:52:50.064844856 +0000 UTC m=+53.810901649" watchObservedRunningTime="2026-04-17 07:52:50.08018898 +0000 UTC m=+53.826245775"
Apr 17 07:52:53.981513 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:53.981480 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2vdv"
Apr 17 07:52:59.455387 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:59.455346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:52:59.455387 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:52:59.455393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:52:59.455907 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:59.455474 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:59.455907 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:59.455520 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:59.455907 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:59.455549 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert podName:faa9121c-e579-414f-8d7d-77beba5608ea nodeName:}" failed. No retries permitted until 2026-04-17 07:53:31.455535631 +0000 UTC m=+95.201592403 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert") pod "ingress-canary-cj7gw" (UID: "faa9121c-e579-414f-8d7d-77beba5608ea") : secret "canary-serving-cert" not found
Apr 17 07:52:59.455907 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:52:59.455600 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls podName:a31b36b1-77de-4517-8c23-566021eb1d32 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:31.455578252 +0000 UTC m=+95.201635027 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls") pod "dns-default-jvx7c" (UID: "a31b36b1-77de-4517-8c23-566021eb1d32") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:01.468938 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:01.468897 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l"
Apr 17 07:53:01.469321 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:01.469050 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:53:01.469321 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:01.469130 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs podName:341e9133-613e-45d4-bb0a-a187c93be340 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:05.469113299 +0000 UTC m=+129.215170070 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs") pod "network-metrics-daemon-ht68l" (UID: "341e9133-613e-45d4-bb0a-a187c93be340") : secret "metrics-daemon-secret" not found
Apr 17 07:53:06.004491 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:06.004460 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-scd9x"
Apr 17 07:53:31.476015 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:31.475974 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:53:31.476591 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:31.476036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:53:31.476591 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:31.476116 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:31.476591 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:31.476120 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:31.476591 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:31.476179 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls podName:a31b36b1-77de-4517-8c23-566021eb1d32 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:35.476164384 +0000 UTC m=+159.222221156 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls") pod "dns-default-jvx7c" (UID: "a31b36b1-77de-4517-8c23-566021eb1d32") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:31.476591 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:31.476192 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert podName:faa9121c-e579-414f-8d7d-77beba5608ea nodeName:}" failed. No retries permitted until 2026-04-17 07:54:35.476186704 +0000 UTC m=+159.222243476 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert") pod "ingress-canary-cj7gw" (UID: "faa9121c-e579-414f-8d7d-77beba5608ea") : secret "canary-serving-cert" not found
Apr 17 07:53:58.220762 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.220723 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"]
Apr 17 07:53:58.223566 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.223549 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:53:58.225806 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.225766 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 07:53:58.226104 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.226090 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 07:53:58.226280 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.226268 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 07:53:58.227447 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.227432 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2p2z5\""
Apr 17 07:53:58.228567 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.228542 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 07:53:58.238961 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.238937 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"]
Apr 17 07:53:58.313371 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.313339 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm"]
Apr 17 07:53:58.316122 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.316105 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm"
Apr 17 07:53:58.318438 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.318417 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-7zph9\""
Apr 17 07:53:58.326510 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.326487 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"]
Apr 17 07:53:58.329486 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.329468 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm"]
Apr 17 07:53:58.329566 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.329556 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"
Apr 17 07:53:58.331959 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.331938 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 07:53:58.332071 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.332017 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:53:58.332071 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.332060 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ltf8n\""
Apr 17 07:53:58.332182 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.332023 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 07:53:58.332576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.332562 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 07:53:58.339861 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.339843 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"]
Apr 17 07:53:58.357193 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.357170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:53:58.357283 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.357221 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/07cc794d-dfcf-4290-beb1-51ec803617e1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:53:58.357323 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.357278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfpmw\" (UniqueName: \"kubernetes.io/projected/07cc794d-dfcf-4290-beb1-51ec803617e1-kube-api-access-kfpmw\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:53:58.457596 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.457570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:53:58.457691 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.457614 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/07cc794d-dfcf-4290-beb1-51ec803617e1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:53:58.457691 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.457636 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfpmw\" (UniqueName: \"kubernetes.io/projected/07cc794d-dfcf-4290-beb1-51ec803617e1-kube-api-access-kfpmw\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:53:58.457763 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:58.457704 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 07:53:58.457763 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:58.457761 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls podName:07cc794d-dfcf-4290-beb1-51ec803617e1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:58.957744773 +0000 UTC m=+122.703801544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6c5tk" (UID: "07cc794d-dfcf-4290-beb1-51ec803617e1") : secret "cluster-monitoring-operator-tls" not found
Apr 17 07:53:58.457869 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.457756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1556503a-a6b4-41c1-a88a-9476a85f4420-serving-cert\") pod \"service-ca-operator-d6fc45fc5-smsf7\" (UID: \"1556503a-a6b4-41c1-a88a-9476a85f4420\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"
Apr 17 07:53:58.457869 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.457844 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpxlk\" (UniqueName: \"kubernetes.io/projected/f5b9d306-bf9d-4e85-b6da-267a2fd97905-kube-api-access-rpxlk\") pod \"network-check-source-8894fc9bd-6ldqm\" (UID: \"f5b9d306-bf9d-4e85-b6da-267a2fd97905\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm"
Apr 17 07:53:58.457929 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.457869 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjzc\" (UniqueName: \"kubernetes.io/projected/1556503a-a6b4-41c1-a88a-9476a85f4420-kube-api-access-9mjzc\") pod \"service-ca-operator-d6fc45fc5-smsf7\" (UID: \"1556503a-a6b4-41c1-a88a-9476a85f4420\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"
Apr 17 07:53:58.457929 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.457909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1556503a-a6b4-41c1-a88a-9476a85f4420-config\") pod \"service-ca-operator-d6fc45fc5-smsf7\" (UID: \"1556503a-a6b4-41c1-a88a-9476a85f4420\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"
Apr 17 07:53:58.458310 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.458290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/07cc794d-dfcf-4290-beb1-51ec803617e1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:53:58.470439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.470410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfpmw\" (UniqueName: \"kubernetes.io/projected/07cc794d-dfcf-4290-beb1-51ec803617e1-kube-api-access-kfpmw\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:53:58.559091 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.559027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1556503a-a6b4-41c1-a88a-9476a85f4420-serving-cert\") pod \"service-ca-operator-d6fc45fc5-smsf7\" (UID: \"1556503a-a6b4-41c1-a88a-9476a85f4420\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"
Apr 17 07:53:58.559091 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.559076 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpxlk\" (UniqueName: \"kubernetes.io/projected/f5b9d306-bf9d-4e85-b6da-267a2fd97905-kube-api-access-rpxlk\") pod \"network-check-source-8894fc9bd-6ldqm\" (UID: \"f5b9d306-bf9d-4e85-b6da-267a2fd97905\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm"
Apr 17 07:53:58.559237 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.559147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjzc\" (UniqueName: \"kubernetes.io/projected/1556503a-a6b4-41c1-a88a-9476a85f4420-kube-api-access-9mjzc\") pod \"service-ca-operator-d6fc45fc5-smsf7\" (UID: \"1556503a-a6b4-41c1-a88a-9476a85f4420\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"
Apr 17 07:53:58.559237 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.559192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1556503a-a6b4-41c1-a88a-9476a85f4420-config\") pod \"service-ca-operator-d6fc45fc5-smsf7\" (UID: \"1556503a-a6b4-41c1-a88a-9476a85f4420\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"
Apr 17 07:53:58.559639 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.559619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1556503a-a6b4-41c1-a88a-9476a85f4420-config\") pod \"service-ca-operator-d6fc45fc5-smsf7\" (UID: \"1556503a-a6b4-41c1-a88a-9476a85f4420\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"
Apr 17 07:53:58.561252 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.561232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1556503a-a6b4-41c1-a88a-9476a85f4420-serving-cert\") pod \"service-ca-operator-d6fc45fc5-smsf7\" (UID: \"1556503a-a6b4-41c1-a88a-9476a85f4420\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"
Apr 17 07:53:58.567808 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.567772 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjzc\" (UniqueName:
\"kubernetes.io/projected/1556503a-a6b4-41c1-a88a-9476a85f4420-kube-api-access-9mjzc\") pod \"service-ca-operator-d6fc45fc5-smsf7\" (UID: \"1556503a-a6b4-41c1-a88a-9476a85f4420\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7" Apr 17 07:53:58.568041 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.568023 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpxlk\" (UniqueName: \"kubernetes.io/projected/f5b9d306-bf9d-4e85-b6da-267a2fd97905-kube-api-access-rpxlk\") pod \"network-check-source-8894fc9bd-6ldqm\" (UID: \"f5b9d306-bf9d-4e85-b6da-267a2fd97905\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm" Apr 17 07:53:58.624047 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.624017 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm" Apr 17 07:53:58.637690 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.637668 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7" Apr 17 07:53:58.744566 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.744539 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm"] Apr 17 07:53:58.747200 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:53:58.747176 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b9d306_bf9d_4e85_b6da_267a2fd97905.slice/crio-e4ab1c1bf3b75df7d8a1a14e876bb2b182095dbda61c10ff4b0c2b4f7386660d WatchSource:0}: Error finding container e4ab1c1bf3b75df7d8a1a14e876bb2b182095dbda61c10ff4b0c2b4f7386660d: Status 404 returned error can't find the container with id e4ab1c1bf3b75df7d8a1a14e876bb2b182095dbda61c10ff4b0c2b4f7386660d Apr 17 07:53:58.758118 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.758094 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7"] Apr 17 07:53:58.761000 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:53:58.760978 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1556503a_a6b4_41c1_a88a_9476a85f4420.slice/crio-ab9b0a096f51ffac1411beb844191e5add62e58f112fb8df4f25176d702aca2f WatchSource:0}: Error finding container ab9b0a096f51ffac1411beb844191e5add62e58f112fb8df4f25176d702aca2f: Status 404 returned error can't find the container with id ab9b0a096f51ffac1411beb844191e5add62e58f112fb8df4f25176d702aca2f Apr 17 07:53:58.961827 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:58.961775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: 
\"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk" Apr 17 07:53:58.961984 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:58.961926 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:58.962025 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:58.961993 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls podName:07cc794d-dfcf-4290-beb1-51ec803617e1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:59.961978302 +0000 UTC m=+123.708035079 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6c5tk" (UID: "07cc794d-dfcf-4290-beb1-51ec803617e1") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:59.164712 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:59.164677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7" event={"ID":"1556503a-a6b4-41c1-a88a-9476a85f4420","Type":"ContainerStarted","Data":"ab9b0a096f51ffac1411beb844191e5add62e58f112fb8df4f25176d702aca2f"} Apr 17 07:53:59.165953 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:59.165931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm" event={"ID":"f5b9d306-bf9d-4e85-b6da-267a2fd97905","Type":"ContainerStarted","Data":"72c0e0728a42e21e114d4eb6d267da517d14974ec050274c6d9f6698cf7d87f3"} Apr 17 07:53:59.165953 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:59.165954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm" event={"ID":"f5b9d306-bf9d-4e85-b6da-267a2fd97905","Type":"ContainerStarted","Data":"e4ab1c1bf3b75df7d8a1a14e876bb2b182095dbda61c10ff4b0c2b4f7386660d"} Apr 17 07:53:59.181706 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:59.181666 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6ldqm" podStartSLOduration=1.181654722 podStartE2EDuration="1.181654722s" podCreationTimestamp="2026-04-17 07:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:53:59.180468005 +0000 UTC m=+122.926524803" watchObservedRunningTime="2026-04-17 07:53:59.181654722 +0000 UTC m=+122.927711515" Apr 17 07:53:59.968460 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:53:59.968424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk" Apr 17 07:53:59.968827 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:59.968561 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:53:59.968827 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:53:59.968618 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls podName:07cc794d-dfcf-4290-beb1-51ec803617e1 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:01.968603024 +0000 UTC m=+125.714659796 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6c5tk" (UID: "07cc794d-dfcf-4290-beb1-51ec803617e1") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:01.171856 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:01.171821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7" event={"ID":"1556503a-a6b4-41c1-a88a-9476a85f4420","Type":"ContainerStarted","Data":"d680ea99ad9922cf8379eec77fdc8bf5220c63bea0bbb72b633585384f8b0f64"} Apr 17 07:54:01.187306 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:01.187252 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7" podStartSLOduration=1.261072082 podStartE2EDuration="3.187239711s" podCreationTimestamp="2026-04-17 07:53:58 +0000 UTC" firstStartedPulling="2026-04-17 07:53:58.762821166 +0000 UTC m=+122.508877938" lastFinishedPulling="2026-04-17 07:54:00.688988783 +0000 UTC m=+124.435045567" observedRunningTime="2026-04-17 07:54:01.186679532 +0000 UTC m=+124.932736325" watchObservedRunningTime="2026-04-17 07:54:01.187239711 +0000 UTC m=+124.933296505" Apr 17 07:54:01.983821 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:01.983769 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk" Apr 17 07:54:01.984036 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:01.983915 2573 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:01.984036 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:01.983976 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls podName:07cc794d-dfcf-4290-beb1-51ec803617e1 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:05.98395951 +0000 UTC m=+129.730016299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6c5tk" (UID: "07cc794d-dfcf-4290-beb1-51ec803617e1") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:03.839247 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:03.839217 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq"] Apr 17 07:54:03.842245 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:03.842226 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq" Apr 17 07:54:03.845035 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:03.845013 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 07:54:03.845998 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:03.845981 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 07:54:03.846067 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:03.845984 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-zvdjw\"" Apr 17 07:54:03.851322 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:03.851301 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq"] Apr 17 07:54:03.995755 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:03.995733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mf8b\" (UniqueName: \"kubernetes.io/projected/35b0b385-72ff-4b20-9adc-aaed0ec7e41e-kube-api-access-4mf8b\") pod \"migrator-74bb7799d9-x4spq\" (UID: \"35b0b385-72ff-4b20-9adc-aaed0ec7e41e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq" Apr 17 07:54:04.096387 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.096328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mf8b\" (UniqueName: \"kubernetes.io/projected/35b0b385-72ff-4b20-9adc-aaed0ec7e41e-kube-api-access-4mf8b\") pod \"migrator-74bb7799d9-x4spq\" (UID: \"35b0b385-72ff-4b20-9adc-aaed0ec7e41e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq" Apr 17 07:54:04.104575 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:54:04.104552 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mf8b\" (UniqueName: \"kubernetes.io/projected/35b0b385-72ff-4b20-9adc-aaed0ec7e41e-kube-api-access-4mf8b\") pod \"migrator-74bb7799d9-x4spq\" (UID: \"35b0b385-72ff-4b20-9adc-aaed0ec7e41e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq" Apr 17 07:54:04.151676 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.151653 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq" Apr 17 07:54:04.260994 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.260966 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq"] Apr 17 07:54:04.263676 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:04.263650 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35b0b385_72ff_4b20_9adc_aaed0ec7e41e.slice/crio-dbfa321854274acb04d605755196390e483294121ea1cd7893d7069058b3d9d5 WatchSource:0}: Error finding container dbfa321854274acb04d605755196390e483294121ea1cd7893d7069058b3d9d5: Status 404 returned error can't find the container with id dbfa321854274acb04d605755196390e483294121ea1cd7893d7069058b3d9d5 Apr 17 07:54:04.572812 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.572756 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kxkqh"] Apr 17 07:54:04.577245 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.577223 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.579980 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.579759 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 07:54:04.579980 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.579826 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 07:54:04.579980 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.579900 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 07:54:04.581023 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.581005 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 07:54:04.581166 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.581091 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-rs2tk\"" Apr 17 07:54:04.582180 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.582159 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kxkqh"] Apr 17 07:54:04.700247 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.700218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5acbc1c7-6c54-4457-999f-06e6688cf11f-signing-cabundle\") pod \"service-ca-865cb79987-kxkqh\" (UID: \"5acbc1c7-6c54-4457-999f-06e6688cf11f\") " pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.700380 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.700304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/5acbc1c7-6c54-4457-999f-06e6688cf11f-signing-key\") pod \"service-ca-865cb79987-kxkqh\" (UID: \"5acbc1c7-6c54-4457-999f-06e6688cf11f\") " pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.700380 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.700350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8z86\" (UniqueName: \"kubernetes.io/projected/5acbc1c7-6c54-4457-999f-06e6688cf11f-kube-api-access-c8z86\") pod \"service-ca-865cb79987-kxkqh\" (UID: \"5acbc1c7-6c54-4457-999f-06e6688cf11f\") " pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.800868 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.800834 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8z86\" (UniqueName: \"kubernetes.io/projected/5acbc1c7-6c54-4457-999f-06e6688cf11f-kube-api-access-c8z86\") pod \"service-ca-865cb79987-kxkqh\" (UID: \"5acbc1c7-6c54-4457-999f-06e6688cf11f\") " pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.801031 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.800921 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5acbc1c7-6c54-4457-999f-06e6688cf11f-signing-cabundle\") pod \"service-ca-865cb79987-kxkqh\" (UID: \"5acbc1c7-6c54-4457-999f-06e6688cf11f\") " pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.801031 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.800976 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5acbc1c7-6c54-4457-999f-06e6688cf11f-signing-key\") pod \"service-ca-865cb79987-kxkqh\" (UID: \"5acbc1c7-6c54-4457-999f-06e6688cf11f\") " pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.801677 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.801653 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5acbc1c7-6c54-4457-999f-06e6688cf11f-signing-cabundle\") pod \"service-ca-865cb79987-kxkqh\" (UID: \"5acbc1c7-6c54-4457-999f-06e6688cf11f\") " pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.803414 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.803386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5acbc1c7-6c54-4457-999f-06e6688cf11f-signing-key\") pod \"service-ca-865cb79987-kxkqh\" (UID: \"5acbc1c7-6c54-4457-999f-06e6688cf11f\") " pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.809579 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.809557 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8z86\" (UniqueName: \"kubernetes.io/projected/5acbc1c7-6c54-4457-999f-06e6688cf11f-kube-api-access-c8z86\") pod \"service-ca-865cb79987-kxkqh\" (UID: \"5acbc1c7-6c54-4457-999f-06e6688cf11f\") " pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:04.887114 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:04.887082 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kxkqh" Apr 17 07:54:05.007421 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:05.007390 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kxkqh"] Apr 17 07:54:05.057981 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:05.057952 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r5k8g_4c7d0c52-01d6-4b13-b631-cd9e35e59fa6/dns-node-resolver/0.log" Apr 17 07:54:05.184743 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:05.184651 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq" event={"ID":"35b0b385-72ff-4b20-9adc-aaed0ec7e41e","Type":"ContainerStarted","Data":"dbfa321854274acb04d605755196390e483294121ea1cd7893d7069058b3d9d5"} Apr 17 07:54:05.199215 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:05.199187 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5acbc1c7_6c54_4457_999f_06e6688cf11f.slice/crio-010e352907a88ffdc13a70bc4babe8f0bc0b55a3980d208f973736a48b409b92 WatchSource:0}: Error finding container 010e352907a88ffdc13a70bc4babe8f0bc0b55a3980d208f973736a48b409b92: Status 404 returned error can't find the container with id 010e352907a88ffdc13a70bc4babe8f0bc0b55a3980d208f973736a48b409b92 Apr 17 07:54:05.507638 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:05.507561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:54:05.507820 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:05.507734 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: 
secret "metrics-daemon-secret" not found Apr 17 07:54:05.507896 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:05.507829 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs podName:341e9133-613e-45d4-bb0a-a187c93be340 nodeName:}" failed. No retries permitted until 2026-04-17 07:56:07.507808867 +0000 UTC m=+251.253865663 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs") pod "network-metrics-daemon-ht68l" (UID: "341e9133-613e-45d4-bb0a-a187c93be340") : secret "metrics-daemon-secret" not found Apr 17 07:54:05.858514 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:05.858427 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kspjc_d3454833-6f08-4cd5-9692-e10872c4ec39/node-ca/0.log" Apr 17 07:54:06.010913 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:06.010879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk" Apr 17 07:54:06.011287 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:06.011014 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:06.011287 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:06.011076 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls podName:07cc794d-dfcf-4290-beb1-51ec803617e1 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:54:14.011061149 +0000 UTC m=+137.757117921 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6c5tk" (UID: "07cc794d-dfcf-4290-beb1-51ec803617e1") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:06.192620 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:06.192579 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq" event={"ID":"35b0b385-72ff-4b20-9adc-aaed0ec7e41e","Type":"ContainerStarted","Data":"83638f407c7fa6cc704ed5695b125b07587d6f498c8c3de6dab031d868efd40d"} Apr 17 07:54:06.192620 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:06.192625 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq" event={"ID":"35b0b385-72ff-4b20-9adc-aaed0ec7e41e","Type":"ContainerStarted","Data":"45442f8039f62295af2e304606d21a9962288aa7b5cfa8e442c23fb0b987b96f"} Apr 17 07:54:06.193943 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:06.193917 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kxkqh" event={"ID":"5acbc1c7-6c54-4457-999f-06e6688cf11f","Type":"ContainerStarted","Data":"0d02f9f94d1881e5ac556d11d0c9ff301928e556e489879acf09a998fb8c3a2e"} Apr 17 07:54:06.194047 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:06.193948 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kxkqh" event={"ID":"5acbc1c7-6c54-4457-999f-06e6688cf11f","Type":"ContainerStarted","Data":"010e352907a88ffdc13a70bc4babe8f0bc0b55a3980d208f973736a48b409b92"} Apr 17 07:54:06.215950 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:06.215906 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-x4spq" podStartSLOduration=2.237895554 podStartE2EDuration="3.215892892s" podCreationTimestamp="2026-04-17 07:54:03 +0000 UTC" firstStartedPulling="2026-04-17 07:54:04.265414992 +0000 UTC m=+128.011471765" lastFinishedPulling="2026-04-17 07:54:05.243412314 +0000 UTC m=+128.989469103" observedRunningTime="2026-04-17 07:54:06.214823864 +0000 UTC m=+129.960880654" watchObservedRunningTime="2026-04-17 07:54:06.215892892 +0000 UTC m=+129.961949686" Apr 17 07:54:06.233186 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:06.233146 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-kxkqh" podStartSLOduration=2.233131407 podStartE2EDuration="2.233131407s" podCreationTimestamp="2026-04-17 07:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:06.23175194 +0000 UTC m=+129.977808740" watchObservedRunningTime="2026-04-17 07:54:06.233131407 +0000 UTC m=+129.979188195" Apr 17 07:54:14.075106 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:14.075072 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk" Apr 17 07:54:14.075574 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:14.075261 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:14.075574 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:14.075353 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls podName:07cc794d-dfcf-4290-beb1-51ec803617e1 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:30.075332572 +0000 UTC m=+153.821389352 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6c5tk" (UID: "07cc794d-dfcf-4290-beb1-51ec803617e1") : secret "cluster-monitoring-operator-tls" not found Apr 17 07:54:23.907220 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.907186 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b"] Apr 17 07:54:23.909776 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.909750 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" Apr 17 07:54:23.910948 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.910925 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76667dbc66-tr8pb"] Apr 17 07:54:23.912254 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.912232 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-52vmd\"" Apr 17 07:54:23.912523 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.912507 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 07:54:23.912779 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.912763 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 07:54:23.912911 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.912896 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:23.916372 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.916353 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 07:54:23.916470 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.916372 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-njxn8\"" Apr 17 07:54:23.916470 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.916382 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 07:54:23.916470 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.916433 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 07:54:23.920659 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.920641 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 07:54:23.922953 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.922933 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b"] Apr 17 07:54:23.936696 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.936677 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76667dbc66-tr8pb"] Apr 17 07:54:23.948428 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.948408 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-image-registry-private-configuration\") pod \"image-registry-76667dbc66-tr8pb\" 
(UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:23.948518 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.948436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-trusted-ca\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:23.948518 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.948454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9ab094e1-f37a-480c-ba6e-88c223afc6fb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lmh5b\" (UID: \"9ab094e1-f37a-480c-ba6e-88c223afc6fb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" Apr 17 07:54:23.948518 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.948481 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab094e1-f37a-480c-ba6e-88c223afc6fb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmh5b\" (UID: \"9ab094e1-f37a-480c-ba6e-88c223afc6fb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" Apr 17 07:54:23.948518 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.948498 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-tls\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:23.948663 ip-10-0-137-8 
kubenswrapper[2573]: I0417 07:54:23.948545 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ce99028a-4e20-42a4-83dd-4334e63fa45f-ca-trust-extracted\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:23.948663 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.948589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-certificates\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:23.948663 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.948646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-bound-sa-token\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:23.948753 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.948684 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-installation-pull-secrets\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:23.948753 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.948735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2699\" 
(UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-kube-api-access-b2699\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:23.984471 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.984451 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wshg8"] Apr 17 07:54:23.986488 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.986471 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:23.989775 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.989757 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 07:54:23.990496 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.990480 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 07:54:23.990724 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.990712 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s2zn8\"" Apr 17 07:54:23.991420 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.991404 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 07:54:23.993318 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:23.993301 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 07:54:24.009297 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.009275 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wshg8"] Apr 17 07:54:24.049276 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:54:24.049251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ce99028a-4e20-42a4-83dd-4334e63fa45f-ca-trust-extracted\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.049406 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-certificates\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.049406 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-bound-sa-token\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.049406 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049334 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.049406 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-installation-pull-secrets\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.049406 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049387 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-crio-socket\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.049406 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr7rx\" (UniqueName: \"kubernetes.io/projected/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-kube-api-access-kr7rx\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.049628 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049422 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-data-volume\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.049628 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049445 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " 
pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.049628 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2699\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-kube-api-access-b2699\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.049732 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-image-registry-private-configuration\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.049732 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049678 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ce99028a-4e20-42a4-83dd-4334e63fa45f-ca-trust-extracted\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.049732 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049718 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-trusted-ca\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.049879 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/9ab094e1-f37a-480c-ba6e-88c223afc6fb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lmh5b\" (UID: \"9ab094e1-f37a-480c-ba6e-88c223afc6fb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" Apr 17 07:54:24.049879 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049798 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab094e1-f37a-480c-ba6e-88c223afc6fb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmh5b\" (UID: \"9ab094e1-f37a-480c-ba6e-88c223afc6fb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" Apr 17 07:54:24.049879 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.049826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-tls\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.050230 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.050207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-certificates\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.050398 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.050378 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9ab094e1-f37a-480c-ba6e-88c223afc6fb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lmh5b\" (UID: \"9ab094e1-f37a-480c-ba6e-88c223afc6fb\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" Apr 17 07:54:24.050607 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.050579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-trusted-ca\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.051975 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.051949 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-installation-pull-secrets\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.052060 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.052026 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-image-registry-private-configuration\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.052503 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.052484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab094e1-f37a-480c-ba6e-88c223afc6fb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lmh5b\" (UID: \"9ab094e1-f37a-480c-ba6e-88c223afc6fb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" Apr 17 07:54:24.052595 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.052575 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-tls\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.061658 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.061634 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76667dbc66-tr8pb"] Apr 17 07:54:24.061862 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:24.061840 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token kube-api-access-b2699], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" podUID="ce99028a-4e20-42a4-83dd-4334e63fa45f" Apr 17 07:54:24.074230 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.074207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2699\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-kube-api-access-b2699\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.075367 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.075341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-bound-sa-token\") pod \"image-registry-76667dbc66-tr8pb\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") " pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.096220 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.096199 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-549bbc77ff-hvf4f"] Apr 17 07:54:24.098819 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.098805 2573 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.112244 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.112222 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-549bbc77ff-hvf4f"] Apr 17 07:54:24.150492 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-registry-certificates\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.150595 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150508 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-crio-socket\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.150595 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150526 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-ca-trust-extracted\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.150595 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150544 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr7rx\" (UniqueName: \"kubernetes.io/projected/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-kube-api-access-kr7rx\") pod \"insights-runtime-extractor-wshg8\" (UID: 
\"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.150762 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-data-volume\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.150762 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-crio-socket\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.150762 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.150762 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-bound-sa-token\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.150762 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150738 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9s5ql\" (UniqueName: \"kubernetes.io/projected/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-kube-api-access-9s5ql\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.151014 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-image-registry-private-configuration\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.151014 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-trusted-ca\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.151014 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.151014 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-data-volume\") pod \"insights-runtime-extractor-wshg8\" (UID: 
\"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.151014 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150911 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-registry-tls\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.151014 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.150957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-installation-pull-secrets\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.151241 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.151219 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.153036 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.153016 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.168219 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.168169 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kr7rx\" (UniqueName: \"kubernetes.io/projected/b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f-kube-api-access-kr7rx\") pod \"insights-runtime-extractor-wshg8\" (UID: \"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f\") " pod="openshift-insights/insights-runtime-extractor-wshg8" Apr 17 07:54:24.221067 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.221048 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" Apr 17 07:54:24.236695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.236673 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.241023 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.241008 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76667dbc66-tr8pb" Apr 17 07:54:24.252236 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.252217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-bound-sa-token\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.252344 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.252255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s5ql\" (UniqueName: \"kubernetes.io/projected/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-kube-api-access-9s5ql\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:24.252344 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.252285 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-image-registry-private-configuration\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.252451 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.252353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-trusted-ca\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.252451 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.252423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-registry-tls\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.252549 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.252472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-installation-pull-secrets\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.252549 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.252512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-registry-certificates\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.252639 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.252556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-ca-trust-extracted\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.252992 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.252965 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-ca-trust-extracted\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.253492 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.253465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-registry-certificates\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.253583 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.253536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-trusted-ca\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.255307 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.255153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-image-registry-private-configuration\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.255307 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.255251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-installation-pull-secrets\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.255445 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.255332 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-registry-tls\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.264549 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.264438 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s5ql\" (UniqueName: \"kubernetes.io/projected/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-kube-api-access-9s5ql\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.264644 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.264567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd080088-4dd5-4cdc-94f9-ebe6ce802c67-bound-sa-token\") pod \"image-registry-549bbc77ff-hvf4f\" (UID: \"fd080088-4dd5-4cdc-94f9-ebe6ce802c67\") " pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.295053 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.295022 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wshg8"
Apr 17 07:54:24.337915 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.337888 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b"]
Apr 17 07:54:24.342424 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:24.342349 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab094e1_f37a_480c_ba6e_88c223afc6fb.slice/crio-777b55b7509eb4a6414cbf1db3cf09ccb0bf5e966ac2c9bc39c89535ab7639c9 WatchSource:0}: Error finding container 777b55b7509eb4a6414cbf1db3cf09ccb0bf5e966ac2c9bc39c89535ab7639c9: Status 404 returned error can't find the container with id 777b55b7509eb4a6414cbf1db3cf09ccb0bf5e966ac2c9bc39c89535ab7639c9
Apr 17 07:54:24.353840 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.353812 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-image-registry-private-configuration\") pod \"ce99028a-4e20-42a4-83dd-4334e63fa45f\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") "
Apr 17 07:54:24.353994 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.353869 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-installation-pull-secrets\") pod \"ce99028a-4e20-42a4-83dd-4334e63fa45f\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") "
Apr 17 07:54:24.353994 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.353899 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2699\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-kube-api-access-b2699\") pod \"ce99028a-4e20-42a4-83dd-4334e63fa45f\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") "
Apr 17 07:54:24.353994 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.353955 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ce99028a-4e20-42a4-83dd-4334e63fa45f-ca-trust-extracted\") pod \"ce99028a-4e20-42a4-83dd-4334e63fa45f\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") "
Apr 17 07:54:24.353994 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.353987 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-trusted-ca\") pod \"ce99028a-4e20-42a4-83dd-4334e63fa45f\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") "
Apr 17 07:54:24.354419 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.354013 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-tls\") pod \"ce99028a-4e20-42a4-83dd-4334e63fa45f\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") "
Apr 17 07:54:24.354419 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.354068 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-certificates\") pod \"ce99028a-4e20-42a4-83dd-4334e63fa45f\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") "
Apr 17 07:54:24.354419 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.354101 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-bound-sa-token\") pod \"ce99028a-4e20-42a4-83dd-4334e63fa45f\" (UID: \"ce99028a-4e20-42a4-83dd-4334e63fa45f\") "
Apr 17 07:54:24.354663 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.354481 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce99028a-4e20-42a4-83dd-4334e63fa45f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ce99028a-4e20-42a4-83dd-4334e63fa45f" (UID: "ce99028a-4e20-42a4-83dd-4334e63fa45f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 07:54:24.355892 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.355854 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ce99028a-4e20-42a4-83dd-4334e63fa45f" (UID: "ce99028a-4e20-42a4-83dd-4334e63fa45f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:54:24.356216 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.356188 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ce99028a-4e20-42a4-83dd-4334e63fa45f" (UID: "ce99028a-4e20-42a4-83dd-4334e63fa45f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:54:24.356430 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.356407 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ce99028a-4e20-42a4-83dd-4334e63fa45f" (UID: "ce99028a-4e20-42a4-83dd-4334e63fa45f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:54:24.357093 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.357074 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ce99028a-4e20-42a4-83dd-4334e63fa45f" (UID: "ce99028a-4e20-42a4-83dd-4334e63fa45f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:54:24.357204 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.357170 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ce99028a-4e20-42a4-83dd-4334e63fa45f" (UID: "ce99028a-4e20-42a4-83dd-4334e63fa45f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:54:24.358653 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.358627 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ce99028a-4e20-42a4-83dd-4334e63fa45f" (UID: "ce99028a-4e20-42a4-83dd-4334e63fa45f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:54:24.358653 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.358640 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-kube-api-access-b2699" (OuterVolumeSpecName: "kube-api-access-b2699") pod "ce99028a-4e20-42a4-83dd-4334e63fa45f" (UID: "ce99028a-4e20-42a4-83dd-4334e63fa45f"). InnerVolumeSpecName "kube-api-access-b2699". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:54:24.407374 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.407347 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:24.412508 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.412488 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wshg8"]
Apr 17 07:54:24.414879 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:24.414815 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb76c5e85_a5e6_4e58_9aa3_f2ffe3d27e1f.slice/crio-210600a3f6d4e54411fc83dd0fc21ff02da37fb865ab7b2adeaedc1649ec4b2e WatchSource:0}: Error finding container 210600a3f6d4e54411fc83dd0fc21ff02da37fb865ab7b2adeaedc1649ec4b2e: Status 404 returned error can't find the container with id 210600a3f6d4e54411fc83dd0fc21ff02da37fb865ab7b2adeaedc1649ec4b2e
Apr 17 07:54:24.455576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.455425 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ce99028a-4e20-42a4-83dd-4334e63fa45f-ca-trust-extracted\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 07:54:24.455576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.455458 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-trusted-ca\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 07:54:24.455576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.455475 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-tls\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 07:54:24.455576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.455490 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ce99028a-4e20-42a4-83dd-4334e63fa45f-registry-certificates\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 07:54:24.455576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.455503 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-bound-sa-token\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 07:54:24.455576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.455518 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-image-registry-private-configuration\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 07:54:24.455576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.455536 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ce99028a-4e20-42a4-83dd-4334e63fa45f-installation-pull-secrets\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 07:54:24.455576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.455552 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2699\" (UniqueName: \"kubernetes.io/projected/ce99028a-4e20-42a4-83dd-4334e63fa45f-kube-api-access-b2699\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 07:54:24.549360 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:24.549276 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-549bbc77ff-hvf4f"]
Apr 17 07:54:24.551554 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:24.551528 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd080088_4dd5_4cdc_94f9_ebe6ce802c67.slice/crio-49442ece125a2724fa49dbf33e8079c8869d9e7e4e3cdae389545a136d355d95 WatchSource:0}: Error finding container 49442ece125a2724fa49dbf33e8079c8869d9e7e4e3cdae389545a136d355d95: Status 404 returned error can't find the container with id 49442ece125a2724fa49dbf33e8079c8869d9e7e4e3cdae389545a136d355d95
Apr 17 07:54:25.243171 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.243138 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" event={"ID":"9ab094e1-f37a-480c-ba6e-88c223afc6fb","Type":"ContainerStarted","Data":"777b55b7509eb4a6414cbf1db3cf09ccb0bf5e966ac2c9bc39c89535ab7639c9"}
Apr 17 07:54:25.244422 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.244397 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" event={"ID":"fd080088-4dd5-4cdc-94f9-ebe6ce802c67","Type":"ContainerStarted","Data":"1658d7c4ff7487b0648495a9f1df67810680a20c8cb9946be4b9f70e2a3418e4"}
Apr 17 07:54:25.244525 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.244424 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" event={"ID":"fd080088-4dd5-4cdc-94f9-ebe6ce802c67","Type":"ContainerStarted","Data":"49442ece125a2724fa49dbf33e8079c8869d9e7e4e3cdae389545a136d355d95"}
Apr 17 07:54:25.244525 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.244490 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f"
Apr 17 07:54:25.245656 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.245636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wshg8" event={"ID":"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f","Type":"ContainerStarted","Data":"c2258e685c26aa03ac268b906ee426a72b27d0436dc48c5a8735bea31e151b9d"}
Apr 17 07:54:25.245656 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.245646 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76667dbc66-tr8pb"
Apr 17 07:54:25.245774 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.245659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wshg8" event={"ID":"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f","Type":"ContainerStarted","Data":"210600a3f6d4e54411fc83dd0fc21ff02da37fb865ab7b2adeaedc1649ec4b2e"}
Apr 17 07:54:25.283457 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.283409 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" podStartSLOduration=1.283390184 podStartE2EDuration="1.283390184s" podCreationTimestamp="2026-04-17 07:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:25.263297853 +0000 UTC m=+149.009354658" watchObservedRunningTime="2026-04-17 07:54:25.283390184 +0000 UTC m=+149.029446977"
Apr 17 07:54:25.292739 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.292708 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76667dbc66-tr8pb"]
Apr 17 07:54:25.295675 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:25.295647 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-76667dbc66-tr8pb"]
Apr 17 07:54:26.249466 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:26.249425 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" event={"ID":"9ab094e1-f37a-480c-ba6e-88c223afc6fb","Type":"ContainerStarted","Data":"6e918851870bee18ce7bdb50be6784cd9e1d965a756eef223318e183bef67f2c"}
Apr 17 07:54:26.251176 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:26.251134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wshg8" event={"ID":"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f","Type":"ContainerStarted","Data":"0348c106c021eb29cddff1517ded2f4eab6d454626a7166da3df98bab19da228"}
Apr 17 07:54:26.270002 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:26.269939 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lmh5b" podStartSLOduration=2.313853923 podStartE2EDuration="3.269925436s" podCreationTimestamp="2026-04-17 07:54:23 +0000 UTC" firstStartedPulling="2026-04-17 07:54:24.344693073 +0000 UTC m=+148.090749861" lastFinishedPulling="2026-04-17 07:54:25.300764594 +0000 UTC m=+149.046821374" observedRunningTime="2026-04-17 07:54:26.26881831 +0000 UTC m=+150.014875104" watchObservedRunningTime="2026-04-17 07:54:26.269925436 +0000 UTC m=+150.015982240"
Apr 17 07:54:26.802194 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:26.802161 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce99028a-4e20-42a4-83dd-4334e63fa45f" path="/var/lib/kubelet/pods/ce99028a-4e20-42a4-83dd-4334e63fa45f/volumes"
Apr 17 07:54:27.255667 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:27.255631 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wshg8" event={"ID":"b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f","Type":"ContainerStarted","Data":"06e963b22073d73da00cb8cd9a3a2ba5b6fe8a36d7f8bf40a254fdbdad5b5c9c"}
Apr 17 07:54:27.273454 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:27.273403 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wshg8" podStartSLOduration=2.161164548 podStartE2EDuration="4.273388194s" podCreationTimestamp="2026-04-17 07:54:23 +0000 UTC" firstStartedPulling="2026-04-17 07:54:24.474071102 +0000 UTC m=+148.220127879" lastFinishedPulling="2026-04-17 07:54:26.586294741 +0000 UTC m=+150.332351525" observedRunningTime="2026-04-17 07:54:27.272096835 +0000 UTC m=+151.018153630" watchObservedRunningTime="2026-04-17 07:54:27.273388194 +0000 UTC m=+151.019444986"
Apr 17 07:54:30.098458 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:30.098418 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:54:30.100892 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:30.100867 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/07cc794d-dfcf-4290-beb1-51ec803617e1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6c5tk\" (UID: \"07cc794d-dfcf-4290-beb1-51ec803617e1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:54:30.331977 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:30.331930 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"
Apr 17 07:54:30.444766 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:30.444737 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk"]
Apr 17 07:54:30.447852 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:30.447820 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07cc794d_dfcf_4290_beb1_51ec803617e1.slice/crio-e79c15b1909239d343538652d835d25068701e3583601e288a7bc6d99aa37539 WatchSource:0}: Error finding container e79c15b1909239d343538652d835d25068701e3583601e288a7bc6d99aa37539: Status 404 returned error can't find the container with id e79c15b1909239d343538652d835d25068701e3583601e288a7bc6d99aa37539
Apr 17 07:54:30.652068 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:30.651969 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jvx7c" podUID="a31b36b1-77de-4517-8c23-566021eb1d32"
Apr 17 07:54:30.660192 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:30.660152 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cj7gw" podUID="faa9121c-e579-414f-8d7d-77beba5608ea"
Apr 17 07:54:31.266854 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:31.266819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk" event={"ID":"07cc794d-dfcf-4290-beb1-51ec803617e1","Type":"ContainerStarted","Data":"e79c15b1909239d343538652d835d25068701e3583601e288a7bc6d99aa37539"}
Apr 17 07:54:31.266854 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:31.266841 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:54:31.811266 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:31.811226 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ht68l" podUID="341e9133-613e-45d4-bb0a-a187c93be340"
Apr 17 07:54:32.271191 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:32.271160 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk" event={"ID":"07cc794d-dfcf-4290-beb1-51ec803617e1","Type":"ContainerStarted","Data":"5c92664977af527eb1ba2eb306622c8940b3989a8573fd8d6a5a3a00627785d2"}
Apr 17 07:54:32.288099 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:32.288050 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6c5tk" podStartSLOduration=32.56662918 podStartE2EDuration="34.288037354s" podCreationTimestamp="2026-04-17 07:53:58 +0000 UTC" firstStartedPulling="2026-04-17 07:54:30.449778794 +0000 UTC m=+154.195835581" lastFinishedPulling="2026-04-17 07:54:32.171186984 +0000 UTC m=+155.917243755" observedRunningTime="2026-04-17 07:54:32.286823712 +0000 UTC m=+156.032880505" watchObservedRunningTime="2026-04-17 07:54:32.288037354 +0000 UTC m=+156.034094148"
Apr 17 07:54:35.538422 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:35.538376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:54:35.538422 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:35.538438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:54:35.541004 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:35.540977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31b36b1-77de-4517-8c23-566021eb1d32-metrics-tls\") pod \"dns-default-jvx7c\" (UID: \"a31b36b1-77de-4517-8c23-566021eb1d32\") " pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:54:35.541111 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:35.541040 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa9121c-e579-414f-8d7d-77beba5608ea-cert\") pod \"ingress-canary-cj7gw\" (UID: \"faa9121c-e579-414f-8d7d-77beba5608ea\") " pod="openshift-ingress-canary/ingress-canary-cj7gw"
Apr 17 07:54:35.770124 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:35.770091 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xs2k6\""
Apr 17 07:54:35.778082 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:35.778049 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:54:35.911859 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:35.911826 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jvx7c"]
Apr 17 07:54:36.282385 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:36.282340 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jvx7c" event={"ID":"a31b36b1-77de-4517-8c23-566021eb1d32","Type":"ContainerStarted","Data":"1f30209c63e6b58674300d857f6121d09c4bc3b4795fa941a38651be6ecc831e"}
Apr 17 07:54:37.285959 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:37.285933 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jvx7c" event={"ID":"a31b36b1-77de-4517-8c23-566021eb1d32","Type":"ContainerStarted","Data":"b80732904df9434df8040c93bb5b0f628d1dd09762fa97b2e648c59065bc9305"}
Apr 17 07:54:38.292874 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:38.292841 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jvx7c" event={"ID":"a31b36b1-77de-4517-8c23-566021eb1d32","Type":"ContainerStarted","Data":"3e7c0523524b6e70e435dc921263beabc36622dfa4473167b1e8b787e1213445"}
Apr 17 07:54:38.293241 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:38.292952 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jvx7c"
Apr 17 07:54:38.310512 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:38.310464 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jvx7c" podStartSLOduration=130.140163056 podStartE2EDuration="2m11.310450382s" podCreationTimestamp="2026-04-17 07:52:27 +0000 UTC" firstStartedPulling="2026-04-17 07:54:35.913616339 +0000 UTC m=+159.659673111" lastFinishedPulling="2026-04-17 07:54:37.083903652 +0000 UTC m=+160.829960437" observedRunningTime="2026-04-17 07:54:38.309776298 +0000 UTC m=+162.055833091" watchObservedRunningTime="2026-04-17 07:54:38.310450382 +0000 UTC m=+162.056507218"
Apr 17 07:54:43.228354 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.228273 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-58jwp"]
Apr 17 07:54:43.230727 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.230707 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp"
Apr 17 07:54:43.239046 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.238752 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-qw2v2\""
Apr 17 07:54:43.239046 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.238854 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 07:54:43.239046 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.238860 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 07:54:43.239046 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.238954 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 07:54:43.239046 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.239012 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 07:54:43.249453 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.249415 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-58jwp"]
Apr 17 07:54:43.299567 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.299533 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t6pp\" (UniqueName: \"kubernetes.io/projected/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-api-access-7t6pp\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp"
Apr 17 07:54:43.299726 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.299657 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp"
Apr 17 07:54:43.299726 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.299702 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp"
Apr 17 07:54:43.299854 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.299730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55f08fcd-4e31-46fa-be87-935efefdb3d8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp"
Apr 17 07:54:43.299854 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.299783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp"
Apr 17 07:54:43.299949 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.299868 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/55f08fcd-4e31-46fa-be87-935efefdb3d8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp"
Apr 17 07:54:43.314765 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.314739 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tvlxz"]
Apr 17 07:54:43.317232 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.317208 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tvlxz"
Apr 17 07:54:43.319653 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.319634 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 07:54:43.320061 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.320042 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 07:54:43.320257 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.320239 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 07:54:43.320352 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.320336 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-9c8dc\""
Apr 17 07:54:43.400512 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400470 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz"
Apr 17 07:54:43.400696 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae2e1e06-a904-48d8-85a3-5aacdc99b560-sys\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz"
Apr 17 07:54:43.400696 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-tls\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.400696 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae2e1e06-a904-48d8-85a3-5aacdc99b560-metrics-client-ca\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.400696 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ae2e1e06-a904-48d8-85a3-5aacdc99b560-root\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.400696 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.400959 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 
07:54:43.400959 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400779 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-wtmp\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.400959 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400847 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-textfile\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.400959 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400881 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55f08fcd-4e31-46fa-be87-935efefdb3d8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.400959 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.400959 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400943 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-accelerators-collector-config\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.401232 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.400979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spxnt\" (UniqueName: \"kubernetes.io/projected/ae2e1e06-a904-48d8-85a3-5aacdc99b560-kube-api-access-spxnt\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.401232 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.401008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/55f08fcd-4e31-46fa-be87-935efefdb3d8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.401232 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.401037 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t6pp\" (UniqueName: \"kubernetes.io/projected/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-api-access-7t6pp\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.401437 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.401416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/55f08fcd-4e31-46fa-be87-935efefdb3d8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.401646 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.401625 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55f08fcd-4e31-46fa-be87-935efefdb3d8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.401760 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.401716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.403386 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.403363 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.403676 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.403657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.421099 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.421072 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t6pp\" (UniqueName: \"kubernetes.io/projected/55f08fcd-4e31-46fa-be87-935efefdb3d8-kube-api-access-7t6pp\") pod \"kube-state-metrics-69db897b98-58jwp\" (UID: \"55f08fcd-4e31-46fa-be87-935efefdb3d8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.502359 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-accelerators-collector-config\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502359 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502323 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spxnt\" (UniqueName: \"kubernetes.io/projected/ae2e1e06-a904-48d8-85a3-5aacdc99b560-kube-api-access-spxnt\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502577 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502363 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502577 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae2e1e06-a904-48d8-85a3-5aacdc99b560-sys\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " 
pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502577 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-tls\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502577 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502496 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae2e1e06-a904-48d8-85a3-5aacdc99b560-sys\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502577 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae2e1e06-a904-48d8-85a3-5aacdc99b560-metrics-client-ca\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502834 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ae2e1e06-a904-48d8-85a3-5aacdc99b560-root\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502834 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-wtmp\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " 
pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502834 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ae2e1e06-a904-48d8-85a3-5aacdc99b560-root\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.502834 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-textfile\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.503014 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.502902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-wtmp\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.503087 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.503052 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-accelerators-collector-config\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.503087 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.503072 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-textfile\") pod \"node-exporter-tvlxz\" (UID: 
\"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.503213 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.503179 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae2e1e06-a904-48d8-85a3-5aacdc99b560-metrics-client-ca\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.505130 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.505102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-tls\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.505242 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.505136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae2e1e06-a904-48d8-85a3-5aacdc99b560-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.524292 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.524268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spxnt\" (UniqueName: \"kubernetes.io/projected/ae2e1e06-a904-48d8-85a3-5aacdc99b560-kube-api-access-spxnt\") pod \"node-exporter-tvlxz\" (UID: \"ae2e1e06-a904-48d8-85a3-5aacdc99b560\") " pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.541805 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.541754 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" Apr 17 07:54:43.627355 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.627322 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tvlxz" Apr 17 07:54:43.636369 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:43.636311 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2e1e06_a904_48d8_85a3_5aacdc99b560.slice/crio-acb391498b543cba040cb1bbf2fa8ba49eef016b99ff158ee6f2a9cf3932fb83 WatchSource:0}: Error finding container acb391498b543cba040cb1bbf2fa8ba49eef016b99ff158ee6f2a9cf3932fb83: Status 404 returned error can't find the container with id acb391498b543cba040cb1bbf2fa8ba49eef016b99ff158ee6f2a9cf3932fb83 Apr 17 07:54:43.679214 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:43.679180 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-58jwp"] Apr 17 07:54:43.682412 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:43.682385 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55f08fcd_4e31_46fa_be87_935efefdb3d8.slice/crio-d4a5178d5125c26452f1ca8c4d71a93d21638a43d316fb4e98deb3f94e3bf7de WatchSource:0}: Error finding container d4a5178d5125c26452f1ca8c4d71a93d21638a43d316fb4e98deb3f94e3bf7de: Status 404 returned error can't find the container with id d4a5178d5125c26452f1ca8c4d71a93d21638a43d316fb4e98deb3f94e3bf7de Apr 17 07:54:44.310803 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:44.310749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tvlxz" event={"ID":"ae2e1e06-a904-48d8-85a3-5aacdc99b560","Type":"ContainerStarted","Data":"acb391498b543cba040cb1bbf2fa8ba49eef016b99ff158ee6f2a9cf3932fb83"} Apr 17 07:54:44.312549 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:54:44.312520 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" event={"ID":"55f08fcd-4e31-46fa-be87-935efefdb3d8","Type":"ContainerStarted","Data":"d4a5178d5125c26452f1ca8c4d71a93d21638a43d316fb4e98deb3f94e3bf7de"} Apr 17 07:54:45.316439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:45.316404 2573 generic.go:358] "Generic (PLEG): container finished" podID="ae2e1e06-a904-48d8-85a3-5aacdc99b560" containerID="f7e9ab78644af15bb8cecc3be98161029529961c9c60d951d0e438e07dd8b839" exitCode=0 Apr 17 07:54:45.316874 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:45.316486 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tvlxz" event={"ID":"ae2e1e06-a904-48d8-85a3-5aacdc99b560","Type":"ContainerDied","Data":"f7e9ab78644af15bb8cecc3be98161029529961c9c60d951d0e438e07dd8b839"} Apr 17 07:54:45.318344 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:45.318316 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" event={"ID":"55f08fcd-4e31-46fa-be87-935efefdb3d8","Type":"ContainerStarted","Data":"a76be01718a15b1fc76a587976c55f257a09cf1c136f12a2f93ece7b41d38ce2"} Apr 17 07:54:45.318447 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:45.318351 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" event={"ID":"55f08fcd-4e31-46fa-be87-935efefdb3d8","Type":"ContainerStarted","Data":"2dec393681d404cdbc7065de5f5461826446a7ba42582a7387fdbfdc50f470a5"} Apr 17 07:54:45.318447 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:45.318364 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" event={"ID":"55f08fcd-4e31-46fa-be87-935efefdb3d8","Type":"ContainerStarted","Data":"874bae33a3465f8425d10ccb86bc7cc84b4f753cf03ac06618f6b4ff919fa466"} Apr 17 07:54:45.375917 ip-10-0-137-8 kubenswrapper[2573]: 
I0417 07:54:45.375860 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-58jwp" podStartSLOduration=1.245988873 podStartE2EDuration="2.375845119s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:43.684452586 +0000 UTC m=+167.430509362" lastFinishedPulling="2026-04-17 07:54:44.814308833 +0000 UTC m=+168.560365608" observedRunningTime="2026-04-17 07:54:45.374716564 +0000 UTC m=+169.120773393" watchObservedRunningTime="2026-04-17 07:54:45.375845119 +0000 UTC m=+169.121901930" Apr 17 07:54:45.798035 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:45.798002 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cj7gw" Apr 17 07:54:45.800938 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:45.800919 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wmmwx\"" Apr 17 07:54:45.809103 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:45.809082 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cj7gw" Apr 17 07:54:45.919428 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:45.919347 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cj7gw"] Apr 17 07:54:45.921624 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:45.921596 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa9121c_e579_414f_8d7d_77beba5608ea.slice/crio-4f617fc63c79d62a6542807655bf72cfdd886450655159b3d7c401a8d85aa55b WatchSource:0}: Error finding container 4f617fc63c79d62a6542807655bf72cfdd886450655159b3d7c401a8d85aa55b: Status 404 returned error can't find the container with id 4f617fc63c79d62a6542807655bf72cfdd886450655159b3d7c401a8d85aa55b Apr 17 07:54:46.256418 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:46.256393 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-549bbc77ff-hvf4f" Apr 17 07:54:46.325698 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:46.325009 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tvlxz" event={"ID":"ae2e1e06-a904-48d8-85a3-5aacdc99b560","Type":"ContainerStarted","Data":"d51004c4e95d16684a527853a78698aa3f5897733a0c04d9a7905811e4dd7ae7"} Apr 17 07:54:46.325698 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:46.325054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tvlxz" event={"ID":"ae2e1e06-a904-48d8-85a3-5aacdc99b560","Type":"ContainerStarted","Data":"1ea3b6c28acb6345cff1059c09b599965e4c168ce7a500e3501bd21bc30649b3"} Apr 17 07:54:46.326857 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:46.326831 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cj7gw" 
event={"ID":"faa9121c-e579-414f-8d7d-77beba5608ea","Type":"ContainerStarted","Data":"4f617fc63c79d62a6542807655bf72cfdd886450655159b3d7c401a8d85aa55b"} Apr 17 07:54:46.352103 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:46.352040 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tvlxz" podStartSLOduration=2.581774506 podStartE2EDuration="3.352018685s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:43.639054687 +0000 UTC m=+167.385111458" lastFinishedPulling="2026-04-17 07:54:44.40929886 +0000 UTC m=+168.155355637" observedRunningTime="2026-04-17 07:54:46.35165291 +0000 UTC m=+170.097709705" watchObservedRunningTime="2026-04-17 07:54:46.352018685 +0000 UTC m=+170.098075488" Apr 17 07:54:46.800537 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:46.800505 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:54:47.625992 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.625951 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-64978bc4d-47kr9"] Apr 17 07:54:47.629116 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.629099 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.633316 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.633290 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1trg39c333bk7\"" Apr 17 07:54:47.633451 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.633346 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-kjjmg\"" Apr 17 07:54:47.633451 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.633344 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 07:54:47.633451 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.633368 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 07:54:47.633451 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.633406 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 07:54:47.633451 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.633294 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 07:54:47.640317 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.640296 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64978bc4d-47kr9"] Apr 17 07:54:47.734979 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.734954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4d11a1ca-4f07-474f-a603-a2cf5a182233-secret-metrics-server-client-certs\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " 
pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.735126 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.734997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9wv\" (UniqueName: \"kubernetes.io/projected/4d11a1ca-4f07-474f-a603-a2cf5a182233-kube-api-access-bh9wv\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.735126 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.735041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d11a1ca-4f07-474f-a603-a2cf5a182233-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.735126 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.735098 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4d11a1ca-4f07-474f-a603-a2cf5a182233-secret-metrics-server-tls\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.735243 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.735146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4d11a1ca-4f07-474f-a603-a2cf5a182233-audit-log\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.735243 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.735184 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4d11a1ca-4f07-474f-a603-a2cf5a182233-metrics-server-audit-profiles\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.735243 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.735200 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d11a1ca-4f07-474f-a603-a2cf5a182233-client-ca-bundle\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.836460 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.836431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4d11a1ca-4f07-474f-a603-a2cf5a182233-secret-metrics-server-tls\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.836610 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.836483 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4d11a1ca-4f07-474f-a603-a2cf5a182233-audit-log\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.836610 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.836513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4d11a1ca-4f07-474f-a603-a2cf5a182233-metrics-server-audit-profiles\") pod \"metrics-server-64978bc4d-47kr9\" (UID: 
\"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.836610 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.836536 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d11a1ca-4f07-474f-a603-a2cf5a182233-client-ca-bundle\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.836610 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.836575 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4d11a1ca-4f07-474f-a603-a2cf5a182233-secret-metrics-server-client-certs\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.836822 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.836612 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh9wv\" (UniqueName: \"kubernetes.io/projected/4d11a1ca-4f07-474f-a603-a2cf5a182233-kube-api-access-bh9wv\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.836822 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.836648 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d11a1ca-4f07-474f-a603-a2cf5a182233-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.836929 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.836871 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4d11a1ca-4f07-474f-a603-a2cf5a182233-audit-log\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.837373 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.837346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d11a1ca-4f07-474f-a603-a2cf5a182233-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.837592 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.837567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4d11a1ca-4f07-474f-a603-a2cf5a182233-metrics-server-audit-profiles\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.839078 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.839054 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4d11a1ca-4f07-474f-a603-a2cf5a182233-secret-metrics-server-client-certs\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.839194 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.839083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4d11a1ca-4f07-474f-a603-a2cf5a182233-secret-metrics-server-tls\") pod \"metrics-server-64978bc4d-47kr9\" (UID: 
\"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.839194 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.839083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d11a1ca-4f07-474f-a603-a2cf5a182233-client-ca-bundle\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.844807 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.844771 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh9wv\" (UniqueName: \"kubernetes.io/projected/4d11a1ca-4f07-474f-a603-a2cf5a182233-kube-api-access-bh9wv\") pod \"metrics-server-64978bc4d-47kr9\" (UID: \"4d11a1ca-4f07-474f-a603-a2cf5a182233\") " pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:47.866160 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.866140 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8"] Apr 17 07:54:47.870336 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.870320 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" Apr 17 07:54:47.873005 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.872986 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-vsbdf\"" Apr 17 07:54:47.873072 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.872991 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 07:54:47.876566 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.876546 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8"] Apr 17 07:54:47.937740 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.937668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/33b34eda-688e-48f5-83f8-7589fa815bfa-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-q6kz8\" (UID: \"33b34eda-688e-48f5-83f8-7589fa815bfa\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" Apr 17 07:54:47.937868 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:47.937820 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:54:48.038392 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:48.038367 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/33b34eda-688e-48f5-83f8-7589fa815bfa-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-q6kz8\" (UID: \"33b34eda-688e-48f5-83f8-7589fa815bfa\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" Apr 17 07:54:48.038533 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:48.038503 2573 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 07:54:48.038588 ip-10-0-137-8 kubenswrapper[2573]: E0417 07:54:48.038563 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33b34eda-688e-48f5-83f8-7589fa815bfa-monitoring-plugin-cert podName:33b34eda-688e-48f5-83f8-7589fa815bfa nodeName:}" failed. No retries permitted until 2026-04-17 07:54:48.538549442 +0000 UTC m=+172.284606217 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/33b34eda-688e-48f5-83f8-7589fa815bfa-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-q6kz8" (UID: "33b34eda-688e-48f5-83f8-7589fa815bfa") : secret "monitoring-plugin-cert" not found Apr 17 07:54:48.085026 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:48.084990 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64978bc4d-47kr9"] Apr 17 07:54:48.087680 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:48.087655 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d11a1ca_4f07_474f_a603_a2cf5a182233.slice/crio-12677b0c8937b6bab81b5e71fb2c3051c5d463bd18d6aa64bbd93cb622a4a837 WatchSource:0}: Error finding container 12677b0c8937b6bab81b5e71fb2c3051c5d463bd18d6aa64bbd93cb622a4a837: Status 404 returned error can't find the container with id 12677b0c8937b6bab81b5e71fb2c3051c5d463bd18d6aa64bbd93cb622a4a837 Apr 17 07:54:48.298326 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:48.298254 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jvx7c" Apr 17 07:54:48.335188 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:48.335141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cj7gw" event={"ID":"faa9121c-e579-414f-8d7d-77beba5608ea","Type":"ContainerStarted","Data":"680f8fca80900a39be41d4d903bf29cf018e30d326e5564d3a604a08acff1c9c"} Apr 17 07:54:48.336305 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:48.336282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" event={"ID":"4d11a1ca-4f07-474f-a603-a2cf5a182233","Type":"ContainerStarted","Data":"12677b0c8937b6bab81b5e71fb2c3051c5d463bd18d6aa64bbd93cb622a4a837"} Apr 17 07:54:48.356017 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:54:48.355978 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cj7gw" podStartSLOduration=139.893391367 podStartE2EDuration="2m21.355965013s" podCreationTimestamp="2026-04-17 07:52:27 +0000 UTC" firstStartedPulling="2026-04-17 07:54:45.923674665 +0000 UTC m=+169.669731451" lastFinishedPulling="2026-04-17 07:54:47.386248326 +0000 UTC m=+171.132305097" observedRunningTime="2026-04-17 07:54:48.355806283 +0000 UTC m=+172.101863077" watchObservedRunningTime="2026-04-17 07:54:48.355965013 +0000 UTC m=+172.102021806" Apr 17 07:54:48.542324 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:48.542277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/33b34eda-688e-48f5-83f8-7589fa815bfa-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-q6kz8\" (UID: \"33b34eda-688e-48f5-83f8-7589fa815bfa\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" Apr 17 07:54:48.545087 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:48.545055 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/33b34eda-688e-48f5-83f8-7589fa815bfa-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-q6kz8\" (UID: \"33b34eda-688e-48f5-83f8-7589fa815bfa\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" Apr 17 07:54:48.780612 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:48.780579 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" Apr 17 07:54:48.924487 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:48.924458 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8"] Apr 17 07:54:48.927553 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:48.927526 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b34eda_688e_48f5_83f8_7589fa815bfa.slice/crio-c18103d68115503a3b1dfbe1fe9caf858212439ae4ed4e2b9ee6de16fe5690de WatchSource:0}: Error finding container c18103d68115503a3b1dfbe1fe9caf858212439ae4ed4e2b9ee6de16fe5690de: Status 404 returned error can't find the container with id c18103d68115503a3b1dfbe1fe9caf858212439ae4ed4e2b9ee6de16fe5690de Apr 17 07:54:49.342655 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.342616 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" event={"ID":"33b34eda-688e-48f5-83f8-7589fa815bfa","Type":"ContainerStarted","Data":"c18103d68115503a3b1dfbe1fe9caf858212439ae4ed4e2b9ee6de16fe5690de"} Apr 17 07:54:49.412208 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.412179 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:54:49.416397 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.416376 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.419896 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.419874 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 07:54:49.420378 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.420348 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 07:54:49.421650 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.421615 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 07:54:49.421771 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.421650 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 07:54:49.421965 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.421944 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 07:54:49.423033 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.423012 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 07:54:49.423330 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.423305 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 07:54:49.423488 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.423464 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 07:54:49.423667 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.423652 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e4ftp2752vopt\"" Apr 17 07:54:49.423906 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.423889 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fkhc5\"" Apr 17 07:54:49.424260 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.424109 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 07:54:49.424260 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.424132 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 07:54:49.425430 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.425404 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 07:54:49.428114 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.428083 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 07:54:49.432594 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.432521 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:54:49.551688 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.551654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.551885 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.551704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" 
(UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.551885 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.551735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.551885 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.551761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.551885 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.551813 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjzw7\" (UniqueName: \"kubernetes.io/projected/0967a60a-1d6e-48fc-b498-bd55233e70b0-kube-api-access-kjzw7\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.551885 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.551876 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552154 
ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.551943 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0967a60a-1d6e-48fc-b498-bd55233e70b0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552154 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.551991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552154 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-web-config\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552154 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552154 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552154 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0967a60a-1d6e-48fc-b498-bd55233e70b0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0967a60a-1d6e-48fc-b498-bd55233e70b0-config-out\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-config\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552271 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552324 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.552439 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.552349 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653191 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-config\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653373 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653373 
ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653373 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653266 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653373 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653553 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653553 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653553 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653553 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjzw7\" (UniqueName: \"kubernetes.io/projected/0967a60a-1d6e-48fc-b498-bd55233e70b0-kube-api-access-kjzw7\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653553 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653531 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653784 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0967a60a-1d6e-48fc-b498-bd55233e70b0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653784 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653784 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-web-config\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653784 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653784 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653709 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653784 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0967a60a-1d6e-48fc-b498-bd55233e70b0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.653784 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0967a60a-1d6e-48fc-b498-bd55233e70b0-config-out\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.654130 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.653826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.654536 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.654509 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.655246 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.655222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0967a60a-1d6e-48fc-b498-bd55233e70b0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.656112 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.656064 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.656509 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.656427 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.656680 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.656656 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.657167 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.657136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.659089 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.658988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.659089 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.659002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0967a60a-1d6e-48fc-b498-bd55233e70b0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.659450 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:54:49.659276 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.659978 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.659955 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.660502 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.660479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.661160 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.661107 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0967a60a-1d6e-48fc-b498-bd55233e70b0-config-out\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.661621 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.661595 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.661731 ip-10-0-137-8 
kubenswrapper[2573]: I0417 07:54:49.661711 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-web-config\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.662288 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.662246 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-config\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.662576 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.662554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0967a60a-1d6e-48fc-b498-bd55233e70b0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.663709 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.663688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0967a60a-1d6e-48fc-b498-bd55233e70b0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.664520 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.664496 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjzw7\" (UniqueName: \"kubernetes.io/projected/0967a60a-1d6e-48fc-b498-bd55233e70b0-kube-api-access-kjzw7\") pod \"prometheus-k8s-0\" (UID: \"0967a60a-1d6e-48fc-b498-bd55233e70b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.730641 ip-10-0-137-8 
kubenswrapper[2573]: I0417 07:54:49.730607 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:49.890966 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:49.890797 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:54:50.102509 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:54:50.102433 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0967a60a_1d6e_48fc_b498_bd55233e70b0.slice/crio-7d8f412f4e806abe13508be05acba2b9b2fdab9b91e255b391af2514d35d7528 WatchSource:0}: Error finding container 7d8f412f4e806abe13508be05acba2b9b2fdab9b91e255b391af2514d35d7528: Status 404 returned error can't find the container with id 7d8f412f4e806abe13508be05acba2b9b2fdab9b91e255b391af2514d35d7528 Apr 17 07:54:50.348369 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:50.348333 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" event={"ID":"4d11a1ca-4f07-474f-a603-a2cf5a182233","Type":"ContainerStarted","Data":"5d0f8597d14f33752511d0bc127344a4c1c8f3b2937ea4aa8393a9a52694228b"} Apr 17 07:54:50.349438 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:50.349410 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0967a60a-1d6e-48fc-b498-bd55233e70b0","Type":"ContainerStarted","Data":"7d8f412f4e806abe13508be05acba2b9b2fdab9b91e255b391af2514d35d7528"} Apr 17 07:54:50.350557 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:50.350536 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" event={"ID":"33b34eda-688e-48f5-83f8-7589fa815bfa","Type":"ContainerStarted","Data":"d0e77342dd1449a90bfb9112f1c580009541978483f9d48a94a527f4c1791edb"} Apr 17 07:54:50.350823 ip-10-0-137-8 kubenswrapper[2573]: I0417 
07:54:50.350805 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" Apr 17 07:54:50.355204 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:50.355129 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" Apr 17 07:54:50.365938 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:50.365901 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" podStartSLOduration=2.117689268 podStartE2EDuration="3.365887526s" podCreationTimestamp="2026-04-17 07:54:47 +0000 UTC" firstStartedPulling="2026-04-17 07:54:48.089602038 +0000 UTC m=+171.835658809" lastFinishedPulling="2026-04-17 07:54:49.337800293 +0000 UTC m=+173.083857067" observedRunningTime="2026-04-17 07:54:50.36550566 +0000 UTC m=+174.111562464" watchObservedRunningTime="2026-04-17 07:54:50.365887526 +0000 UTC m=+174.111944322" Apr 17 07:54:50.383298 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:50.383262 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-q6kz8" podStartSLOduration=2.1666774699999998 podStartE2EDuration="3.383252124s" podCreationTimestamp="2026-04-17 07:54:47 +0000 UTC" firstStartedPulling="2026-04-17 07:54:48.929743143 +0000 UTC m=+172.675799915" lastFinishedPulling="2026-04-17 07:54:50.146317796 +0000 UTC m=+173.892374569" observedRunningTime="2026-04-17 07:54:50.38228645 +0000 UTC m=+174.128343242" watchObservedRunningTime="2026-04-17 07:54:50.383252124 +0000 UTC m=+174.129308918" Apr 17 07:54:51.354638 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:51.354533 2573 generic.go:358] "Generic (PLEG): container finished" podID="0967a60a-1d6e-48fc-b498-bd55233e70b0" containerID="66bbdd676f3cf60cb40cdce25316c228b0b9c0584b9754f3af8a8c575037bf93" exitCode=0 Apr 17 07:54:51.354638 ip-10-0-137-8 
kubenswrapper[2573]: I0417 07:54:51.354626 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0967a60a-1d6e-48fc-b498-bd55233e70b0","Type":"ContainerDied","Data":"66bbdd676f3cf60cb40cdce25316c228b0b9c0584b9754f3af8a8c575037bf93"} Apr 17 07:54:54.366695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:54.366661 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0967a60a-1d6e-48fc-b498-bd55233e70b0","Type":"ContainerStarted","Data":"69f1af824dc8d99dc516a3bb03c639fee35f16426aedb75f033f84e7532c9c97"} Apr 17 07:54:54.366695 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:54.366699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0967a60a-1d6e-48fc-b498-bd55233e70b0","Type":"ContainerStarted","Data":"bad88c870adbf5d5c19c2d8ab94d0d1cd59bc09895011af6ec5e0ab725215097"} Apr 17 07:54:56.374724 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:56.374687 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0967a60a-1d6e-48fc-b498-bd55233e70b0","Type":"ContainerStarted","Data":"d519704b1016bbeab69224364e6ebbd7bc5ab6979e746fdcb335313408010a04"} Apr 17 07:54:56.374724 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:56.374721 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0967a60a-1d6e-48fc-b498-bd55233e70b0","Type":"ContainerStarted","Data":"e7e48b3ed87a7901ff614d2ba0008bfecca3760320febb1a5d73568645aa0df2"} Apr 17 07:54:56.374724 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:56.374731 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0967a60a-1d6e-48fc-b498-bd55233e70b0","Type":"ContainerStarted","Data":"fa62b5a9e677193fb4d56465af051b48fb32ce2b3b8a4dc2726d94ca5ea6b2bf"} Apr 17 07:54:56.375334 ip-10-0-137-8 
kubenswrapper[2573]: I0417 07:54:56.374740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0967a60a-1d6e-48fc-b498-bd55233e70b0","Type":"ContainerStarted","Data":"265235a071a800a3d1fc5a4463d96bb2422c5e9dc5d811e784457d84ae816207"} Apr 17 07:54:56.406137 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:56.406092 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.088763108 podStartE2EDuration="7.406078749s" podCreationTimestamp="2026-04-17 07:54:49 +0000 UTC" firstStartedPulling="2026-04-17 07:54:50.104218741 +0000 UTC m=+173.850275516" lastFinishedPulling="2026-04-17 07:54:55.421534382 +0000 UTC m=+179.167591157" observedRunningTime="2026-04-17 07:54:56.403563611 +0000 UTC m=+180.149620438" watchObservedRunningTime="2026-04-17 07:54:56.406078749 +0000 UTC m=+180.152135543" Apr 17 07:54:59.731725 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:54:59.731694 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:07.938832 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:07.938783 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:55:07.938832 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:07.938841 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:55:27.943711 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:27.943679 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 07:55:27.947640 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:27.947613 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-64978bc4d-47kr9" Apr 17 
07:55:36.493453 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:36.493419 2573 generic.go:358] "Generic (PLEG): container finished" podID="1556503a-a6b4-41c1-a88a-9476a85f4420" containerID="d680ea99ad9922cf8379eec77fdc8bf5220c63bea0bbb72b633585384f8b0f64" exitCode=0 Apr 17 07:55:36.493851 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:36.493472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7" event={"ID":"1556503a-a6b4-41c1-a88a-9476a85f4420","Type":"ContainerDied","Data":"d680ea99ad9922cf8379eec77fdc8bf5220c63bea0bbb72b633585384f8b0f64"} Apr 17 07:55:36.493851 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:36.493824 2573 scope.go:117] "RemoveContainer" containerID="d680ea99ad9922cf8379eec77fdc8bf5220c63bea0bbb72b633585384f8b0f64" Apr 17 07:55:37.497636 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:37.497601 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-smsf7" event={"ID":"1556503a-a6b4-41c1-a88a-9476a85f4420","Type":"ContainerStarted","Data":"c50116591bbf9d88f24c620eae17ad82de5b9fb955a7311b71bf08db4ff49f0b"} Apr 17 07:55:49.731086 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:49.731042 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:49.750187 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:49.750161 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:55:50.549370 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:55:50.549339 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:56:07.516014 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:07.515984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:56:07.518187 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:07.518166 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341e9133-613e-45d4-bb0a-a187c93be340-metrics-certs\") pod \"network-metrics-daemon-ht68l\" (UID: \"341e9133-613e-45d4-bb0a-a187c93be340\") " pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:56:07.804769 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:07.804681 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ctqdx\"" Apr 17 07:56:07.812402 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:07.812382 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ht68l" Apr 17 07:56:07.928395 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:07.928372 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ht68l"] Apr 17 07:56:07.930816 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:56:07.930777 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod341e9133_613e_45d4_bb0a_a187c93be340.slice/crio-558dd540dc6df014a28d607a0fcc9b7212dcb1be3f9a45e106526045cc554cd4 WatchSource:0}: Error finding container 558dd540dc6df014a28d607a0fcc9b7212dcb1be3f9a45e106526045cc554cd4: Status 404 returned error can't find the container with id 558dd540dc6df014a28d607a0fcc9b7212dcb1be3f9a45e106526045cc554cd4 Apr 17 07:56:08.586250 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:08.586220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ht68l" 
event={"ID":"341e9133-613e-45d4-bb0a-a187c93be340","Type":"ContainerStarted","Data":"558dd540dc6df014a28d607a0fcc9b7212dcb1be3f9a45e106526045cc554cd4"} Apr 17 07:56:09.590884 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:09.590774 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ht68l" event={"ID":"341e9133-613e-45d4-bb0a-a187c93be340","Type":"ContainerStarted","Data":"a3eb425c007f7bba580d88ab10e276cea75c93c3315b036860f28eedefb5860b"} Apr 17 07:56:09.590884 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:09.590833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ht68l" event={"ID":"341e9133-613e-45d4-bb0a-a187c93be340","Type":"ContainerStarted","Data":"8ba78ff43c741c78f7ce41a854ab5296c233a1bb944a8230ec46cf50ee2a647f"} Apr 17 07:56:09.605642 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:09.605595 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ht68l" podStartSLOduration=252.67476517 podStartE2EDuration="4m13.605580461s" podCreationTimestamp="2026-04-17 07:51:56 +0000 UTC" firstStartedPulling="2026-04-17 07:56:07.932957474 +0000 UTC m=+251.679014246" lastFinishedPulling="2026-04-17 07:56:08.863772765 +0000 UTC m=+252.609829537" observedRunningTime="2026-04-17 07:56:09.604508686 +0000 UTC m=+253.350565483" watchObservedRunningTime="2026-04-17 07:56:09.605580461 +0000 UTC m=+253.351637255" Apr 17 07:56:56.690610 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:56.690580 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 07:56:56.692114 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:56.692090 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 07:56:56.696486 
ip-10-0-137-8 kubenswrapper[2573]: I0417 07:56:56.696466 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 07:59:04.384154 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.384122 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-gn9l4"] Apr 17 07:59:04.387388 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.387370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-gn9l4" Apr 17 07:59:04.389719 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.389698 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 07:59:04.389852 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.389738 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-j2gg4\"" Apr 17 07:59:04.390743 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.390726 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 07:59:04.396472 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.396455 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-gn9l4"] Apr 17 07:59:04.507094 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.507063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pghxq\" (UniqueName: \"kubernetes.io/projected/e639118f-9d78-4727-9836-e893e312f0db-kube-api-access-pghxq\") pod \"cert-manager-759f64656b-gn9l4\" (UID: \"e639118f-9d78-4727-9836-e893e312f0db\") " pod="cert-manager/cert-manager-759f64656b-gn9l4" Apr 17 07:59:04.507306 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.507117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/e639118f-9d78-4727-9836-e893e312f0db-bound-sa-token\") pod \"cert-manager-759f64656b-gn9l4\" (UID: \"e639118f-9d78-4727-9836-e893e312f0db\") " pod="cert-manager/cert-manager-759f64656b-gn9l4" Apr 17 07:59:04.607512 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.607479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pghxq\" (UniqueName: \"kubernetes.io/projected/e639118f-9d78-4727-9836-e893e312f0db-kube-api-access-pghxq\") pod \"cert-manager-759f64656b-gn9l4\" (UID: \"e639118f-9d78-4727-9836-e893e312f0db\") " pod="cert-manager/cert-manager-759f64656b-gn9l4" Apr 17 07:59:04.607654 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.607528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e639118f-9d78-4727-9836-e893e312f0db-bound-sa-token\") pod \"cert-manager-759f64656b-gn9l4\" (UID: \"e639118f-9d78-4727-9836-e893e312f0db\") " pod="cert-manager/cert-manager-759f64656b-gn9l4" Apr 17 07:59:04.616857 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.616827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e639118f-9d78-4727-9836-e893e312f0db-bound-sa-token\") pod \"cert-manager-759f64656b-gn9l4\" (UID: \"e639118f-9d78-4727-9836-e893e312f0db\") " pod="cert-manager/cert-manager-759f64656b-gn9l4" Apr 17 07:59:04.617424 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.617408 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pghxq\" (UniqueName: \"kubernetes.io/projected/e639118f-9d78-4727-9836-e893e312f0db-kube-api-access-pghxq\") pod \"cert-manager-759f64656b-gn9l4\" (UID: \"e639118f-9d78-4727-9836-e893e312f0db\") " pod="cert-manager/cert-manager-759f64656b-gn9l4" Apr 17 07:59:04.703618 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.703534 2573 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-gn9l4" Apr 17 07:59:04.821381 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.821341 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-gn9l4"] Apr 17 07:59:04.824103 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:59:04.824076 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode639118f_9d78_4727_9836_e893e312f0db.slice/crio-ef84a57bbbb5e4c24d69dbde0f7d4b8e349b85a08574d002f79a2cee455ec039 WatchSource:0}: Error finding container ef84a57bbbb5e4c24d69dbde0f7d4b8e349b85a08574d002f79a2cee455ec039: Status 404 returned error can't find the container with id ef84a57bbbb5e4c24d69dbde0f7d4b8e349b85a08574d002f79a2cee455ec039 Apr 17 07:59:04.825910 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:04.825893 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 07:59:05.073515 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:05.073416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-gn9l4" event={"ID":"e639118f-9d78-4727-9836-e893e312f0db","Type":"ContainerStarted","Data":"ef84a57bbbb5e4c24d69dbde0f7d4b8e349b85a08574d002f79a2cee455ec039"} Apr 17 07:59:08.085352 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:08.085312 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-gn9l4" event={"ID":"e639118f-9d78-4727-9836-e893e312f0db","Type":"ContainerStarted","Data":"2c2c07f73f15df46e2df049ca43595d5dcf50e7782426971621fdb89d821db3c"} Apr 17 07:59:34.831372 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.831303 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-gn9l4" podStartSLOduration=27.803185533 podStartE2EDuration="30.831286396s" podCreationTimestamp="2026-04-17 07:59:04 +0000 UTC" 
firstStartedPulling="2026-04-17 07:59:04.826019799 +0000 UTC m=+428.572076570" lastFinishedPulling="2026-04-17 07:59:07.854120661 +0000 UTC m=+431.600177433" observedRunningTime="2026-04-17 07:59:08.10828324 +0000 UTC m=+431.854340039" watchObservedRunningTime="2026-04-17 07:59:34.831286396 +0000 UTC m=+458.577343198" Apr 17 07:59:34.831821 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.831804 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch"] Apr 17 07:59:34.838927 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.838907 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:34.841979 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.841962 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 07:59:34.842846 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.842826 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 07:59:34.843183 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.843169 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 07:59:34.843238 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.843172 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 07:59:34.844325 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.844308 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:59:34.854519 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.854496 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-w2djl\"" Apr 17 07:59:34.859449 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.859427 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch"] Apr 17 07:59:34.941351 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.941312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c8ec095-5266-47dc-bd56-c3e429562206-cert\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:34.941351 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.941354 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sftl\" (UniqueName: \"kubernetes.io/projected/3c8ec095-5266-47dc-bd56-c3e429562206-kube-api-access-7sftl\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:34.941509 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.941478 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3c8ec095-5266-47dc-bd56-c3e429562206-manager-config\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:34.941545 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:34.941508 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c8ec095-5266-47dc-bd56-c3e429562206-metrics-cert\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: 
\"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.042365 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.042317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c8ec095-5266-47dc-bd56-c3e429562206-cert\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.042560 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.042381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sftl\" (UniqueName: \"kubernetes.io/projected/3c8ec095-5266-47dc-bd56-c3e429562206-kube-api-access-7sftl\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.042560 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.042488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3c8ec095-5266-47dc-bd56-c3e429562206-manager-config\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.042678 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.042613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c8ec095-5266-47dc-bd56-c3e429562206-metrics-cert\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.043332 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.043303 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3c8ec095-5266-47dc-bd56-c3e429562206-manager-config\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.045170 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.045136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c8ec095-5266-47dc-bd56-c3e429562206-cert\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.045294 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.045139 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c8ec095-5266-47dc-bd56-c3e429562206-metrics-cert\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.060655 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.060626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sftl\" (UniqueName: \"kubernetes.io/projected/3c8ec095-5266-47dc-bd56-c3e429562206-kube-api-access-7sftl\") pod \"lws-controller-manager-5c76446df-cw2ch\" (UID: \"3c8ec095-5266-47dc-bd56-c3e429562206\") " pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.148219 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.148164 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:35.284773 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:35.284743 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch"] Apr 17 07:59:35.288030 ip-10-0-137-8 kubenswrapper[2573]: W0417 07:59:35.288000 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c8ec095_5266_47dc_bd56_c3e429562206.slice/crio-bc380e3838bef11962e43e2d1895fb973071aa37f9070909b0be71af3d0de9c4 WatchSource:0}: Error finding container bc380e3838bef11962e43e2d1895fb973071aa37f9070909b0be71af3d0de9c4: Status 404 returned error can't find the container with id bc380e3838bef11962e43e2d1895fb973071aa37f9070909b0be71af3d0de9c4 Apr 17 07:59:36.173858 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:36.173814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" event={"ID":"3c8ec095-5266-47dc-bd56-c3e429562206","Type":"ContainerStarted","Data":"bc380e3838bef11962e43e2d1895fb973071aa37f9070909b0be71af3d0de9c4"} Apr 17 07:59:38.181783 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:38.181748 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" event={"ID":"3c8ec095-5266-47dc-bd56-c3e429562206","Type":"ContainerStarted","Data":"d83b3dab34852a830df803815105eedcedb4398188a845bb8580ee2f317d5d3c"} Apr 17 07:59:38.182219 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:38.181862 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:49.186458 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:49.186424 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" Apr 17 07:59:49.204472 ip-10-0-137-8 kubenswrapper[2573]: I0417 07:59:49.204417 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5c76446df-cw2ch" podStartSLOduration=12.993580589 podStartE2EDuration="15.204406297s" podCreationTimestamp="2026-04-17 07:59:34 +0000 UTC" firstStartedPulling="2026-04-17 07:59:35.289642628 +0000 UTC m=+459.035699400" lastFinishedPulling="2026-04-17 07:59:37.500468336 +0000 UTC m=+461.246525108" observedRunningTime="2026-04-17 07:59:38.208273645 +0000 UTC m=+461.954330439" watchObservedRunningTime="2026-04-17 07:59:49.204406297 +0000 UTC m=+472.950463093" Apr 17 08:00:28.709808 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.709757 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq"] Apr 17 08:00:28.713113 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.713096 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" Apr 17 08:00:28.717943 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.717920 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 08:00:28.718050 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.717986 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-m2krq\"" Apr 17 08:00:28.718050 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.718005 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 08:00:28.718735 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.718715 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 08:00:28.741699 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.741678 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq"] Apr 17 08:00:28.767823 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.767773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8lgr\" (UniqueName: \"kubernetes.io/projected/992920b3-1fa1-430b-a57d-ddc1fa16986c-kube-api-access-r8lgr\") pod \"dns-operator-controller-manager-844548ff4c-bb2rq\" (UID: \"992920b3-1fa1-430b-a57d-ddc1fa16986c\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" Apr 17 08:00:28.868321 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.868281 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8lgr\" (UniqueName: \"kubernetes.io/projected/992920b3-1fa1-430b-a57d-ddc1fa16986c-kube-api-access-r8lgr\") pod \"dns-operator-controller-manager-844548ff4c-bb2rq\" (UID: 
\"992920b3-1fa1-430b-a57d-ddc1fa16986c\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" Apr 17 08:00:28.878173 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:28.878136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8lgr\" (UniqueName: \"kubernetes.io/projected/992920b3-1fa1-430b-a57d-ddc1fa16986c-kube-api-access-r8lgr\") pod \"dns-operator-controller-manager-844548ff4c-bb2rq\" (UID: \"992920b3-1fa1-430b-a57d-ddc1fa16986c\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" Apr 17 08:00:29.023817 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:29.023693 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" Apr 17 08:00:29.144700 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:29.144674 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq"] Apr 17 08:00:29.146837 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:00:29.146806 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod992920b3_1fa1_430b_a57d_ddc1fa16986c.slice/crio-02dda3fdf1fcad320a5c53fc054c0902557910959576e50ccde9646796722e18 WatchSource:0}: Error finding container 02dda3fdf1fcad320a5c53fc054c0902557910959576e50ccde9646796722e18: Status 404 returned error can't find the container with id 02dda3fdf1fcad320a5c53fc054c0902557910959576e50ccde9646796722e18 Apr 17 08:00:29.325175 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:29.325092 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" event={"ID":"992920b3-1fa1-430b-a57d-ddc1fa16986c","Type":"ContainerStarted","Data":"02dda3fdf1fcad320a5c53fc054c0902557910959576e50ccde9646796722e18"} Apr 17 08:00:30.918045 ip-10-0-137-8 kubenswrapper[2573]: I0417 
08:00:30.918007 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr"] Apr 17 08:00:30.927397 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:30.927171 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" Apr 17 08:00:30.929593 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:30.929563 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr"] Apr 17 08:00:30.929885 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:30.929863 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-j2qjc\"" Apr 17 08:00:30.986687 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:30.986649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkpqj\" (UniqueName: \"kubernetes.io/projected/178d1a1b-5cd7-447e-8e8a-b24968742805-kube-api-access-hkpqj\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rpjbr\" (UID: \"178d1a1b-5cd7-447e-8e8a-b24968742805\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" Apr 17 08:00:31.087929 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:31.087891 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkpqj\" (UniqueName: \"kubernetes.io/projected/178d1a1b-5cd7-447e-8e8a-b24968742805-kube-api-access-hkpqj\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rpjbr\" (UID: \"178d1a1b-5cd7-447e-8e8a-b24968742805\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" Apr 17 08:00:31.096558 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:31.096533 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkpqj\" (UniqueName: 
\"kubernetes.io/projected/178d1a1b-5cd7-447e-8e8a-b24968742805-kube-api-access-hkpqj\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rpjbr\" (UID: \"178d1a1b-5cd7-447e-8e8a-b24968742805\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" Apr 17 08:00:31.240457 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:31.240378 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" Apr 17 08:00:31.736572 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:31.736546 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr"] Apr 17 08:00:31.740937 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:00:31.740911 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod178d1a1b_5cd7_447e_8e8a_b24968742805.slice/crio-baeb6e962dcaa7e1242fb29396c8f7dabe1a8df9be141b06816ce02858516f72 WatchSource:0}: Error finding container baeb6e962dcaa7e1242fb29396c8f7dabe1a8df9be141b06816ce02858516f72: Status 404 returned error can't find the container with id baeb6e962dcaa7e1242fb29396c8f7dabe1a8df9be141b06816ce02858516f72 Apr 17 08:00:32.336991 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:32.336945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" event={"ID":"178d1a1b-5cd7-447e-8e8a-b24968742805","Type":"ContainerStarted","Data":"baeb6e962dcaa7e1242fb29396c8f7dabe1a8df9be141b06816ce02858516f72"} Apr 17 08:00:32.338202 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:32.338174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" event={"ID":"992920b3-1fa1-430b-a57d-ddc1fa16986c","Type":"ContainerStarted","Data":"75c6e7129fa7c9a3f21389f4b32e6c5bcf31ee0dba971d6687bac76c3a88aa44"} 
Apr 17 08:00:32.338310 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:32.338295 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" Apr 17 08:00:34.346347 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:34.346313 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" event={"ID":"178d1a1b-5cd7-447e-8e8a-b24968742805","Type":"ContainerStarted","Data":"a2b6e2c28c327cae183339c035d38d5eca69c610b32a5483cde2565018270c9a"} Apr 17 08:00:34.346705 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:34.346419 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" Apr 17 08:00:34.370626 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:34.370579 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" podStartSLOduration=2.762256485 podStartE2EDuration="4.370564712s" podCreationTimestamp="2026-04-17 08:00:30 +0000 UTC" firstStartedPulling="2026-04-17 08:00:31.743453055 +0000 UTC m=+515.489509832" lastFinishedPulling="2026-04-17 08:00:33.351761288 +0000 UTC m=+517.097818059" observedRunningTime="2026-04-17 08:00:34.370054877 +0000 UTC m=+518.116111680" watchObservedRunningTime="2026-04-17 08:00:34.370564712 +0000 UTC m=+518.116621505" Apr 17 08:00:34.371123 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:34.371090 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" podStartSLOduration=3.848065207 podStartE2EDuration="6.371080665s" podCreationTimestamp="2026-04-17 08:00:28 +0000 UTC" firstStartedPulling="2026-04-17 08:00:29.149069797 +0000 UTC m=+512.895126570" lastFinishedPulling="2026-04-17 08:00:31.67208524 +0000 UTC m=+515.418142028" 
observedRunningTime="2026-04-17 08:00:32.389919629 +0000 UTC m=+516.135976423" watchObservedRunningTime="2026-04-17 08:00:34.371080665 +0000 UTC m=+518.117137459" Apr 17 08:00:43.344631 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:43.344556 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-bb2rq" Apr 17 08:00:45.351349 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:00:45.351320 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rpjbr" Apr 17 08:01:16.417649 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.417613 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-c7cs2"] Apr 17 08:01:16.420982 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.420963 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:16.423406 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.423379 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jxfvx\"" Apr 17 08:01:16.423484 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.423379 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 08:01:16.431805 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.430554 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-c7cs2"] Apr 17 08:01:16.460185 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.460158 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-c7cs2"] Apr 17 08:01:16.553067 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.553036 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7vn5\" (UniqueName: \"kubernetes.io/projected/d348c7b3-e753-4884-83ae-71ef67146d47-kube-api-access-m7vn5\") pod \"limitador-limitador-67566c68b4-c7cs2\" (UID: \"d348c7b3-e753-4884-83ae-71ef67146d47\") " pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:16.553199 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.553087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d348c7b3-e753-4884-83ae-71ef67146d47-config-file\") pod \"limitador-limitador-67566c68b4-c7cs2\" (UID: \"d348c7b3-e753-4884-83ae-71ef67146d47\") " pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:16.653915 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.653883 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7vn5\" (UniqueName: \"kubernetes.io/projected/d348c7b3-e753-4884-83ae-71ef67146d47-kube-api-access-m7vn5\") pod \"limitador-limitador-67566c68b4-c7cs2\" (UID: \"d348c7b3-e753-4884-83ae-71ef67146d47\") " pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:16.654064 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.653927 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d348c7b3-e753-4884-83ae-71ef67146d47-config-file\") pod \"limitador-limitador-67566c68b4-c7cs2\" (UID: \"d348c7b3-e753-4884-83ae-71ef67146d47\") " pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:16.654458 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.654441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d348c7b3-e753-4884-83ae-71ef67146d47-config-file\") pod \"limitador-limitador-67566c68b4-c7cs2\" (UID: 
\"d348c7b3-e753-4884-83ae-71ef67146d47\") " pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:16.661510 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.661490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7vn5\" (UniqueName: \"kubernetes.io/projected/d348c7b3-e753-4884-83ae-71ef67146d47-kube-api-access-m7vn5\") pod \"limitador-limitador-67566c68b4-c7cs2\" (UID: \"d348c7b3-e753-4884-83ae-71ef67146d47\") " pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:16.735475 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.735404 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:16.856935 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:16.856899 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-c7cs2"] Apr 17 08:01:16.859733 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:01:16.859704 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd348c7b3_e753_4884_83ae_71ef67146d47.slice/crio-bc760bb9aeb847954238e7db1e26ee94e4a3b1377aa398327cc3b7ba2bd92777 WatchSource:0}: Error finding container bc760bb9aeb847954238e7db1e26ee94e4a3b1377aa398327cc3b7ba2bd92777: Status 404 returned error can't find the container with id bc760bb9aeb847954238e7db1e26ee94e4a3b1377aa398327cc3b7ba2bd92777 Apr 17 08:01:17.485519 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:17.485482 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" event={"ID":"d348c7b3-e753-4884-83ae-71ef67146d47","Type":"ContainerStarted","Data":"bc760bb9aeb847954238e7db1e26ee94e4a3b1377aa398327cc3b7ba2bd92777"} Apr 17 08:01:21.501621 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:21.501582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" event={"ID":"d348c7b3-e753-4884-83ae-71ef67146d47","Type":"ContainerStarted","Data":"421deb0e1879b2719547cfa375fef7d87aecff7776621d36083408e4537293c5"} Apr 17 08:01:21.502046 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:21.501744 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:21.519827 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:21.519760 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" podStartSLOduration=1.786204675 podStartE2EDuration="5.519747191s" podCreationTimestamp="2026-04-17 08:01:16 +0000 UTC" firstStartedPulling="2026-04-17 08:01:16.861499084 +0000 UTC m=+560.607555856" lastFinishedPulling="2026-04-17 08:01:20.595041596 +0000 UTC m=+564.341098372" observedRunningTime="2026-04-17 08:01:21.517782781 +0000 UTC m=+565.263839575" watchObservedRunningTime="2026-04-17 08:01:21.519747191 +0000 UTC m=+565.265804032" Apr 17 08:01:32.506191 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:32.506155 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-c7cs2" Apr 17 08:01:56.715968 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:56.715940 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 08:01:56.716695 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:01:56.716280 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 08:02:00.407983 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.407952 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-mbcj2"] Apr 17 08:02:00.426436 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:02:00.426411 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-mbcj2"] Apr 17 08:02:00.426573 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.426518 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:02:00.429479 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.429443 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 08:02:00.429625 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.429443 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 08:02:00.430585 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.430496 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-zgnsg\"" Apr 17 08:02:00.430585 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.430522 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 08:02:00.517563 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.517530 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9e59cfc6-5f20-419b-aab8-74be02eb7d00-data\") pod \"seaweedfs-86cc847c5c-mbcj2\" (UID: \"9e59cfc6-5f20-419b-aab8-74be02eb7d00\") " pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:02:00.517742 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.517573 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvdc\" (UniqueName: \"kubernetes.io/projected/9e59cfc6-5f20-419b-aab8-74be02eb7d00-kube-api-access-ptvdc\") pod \"seaweedfs-86cc847c5c-mbcj2\" (UID: \"9e59cfc6-5f20-419b-aab8-74be02eb7d00\") " pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:02:00.618773 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:02:00.618726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9e59cfc6-5f20-419b-aab8-74be02eb7d00-data\") pod \"seaweedfs-86cc847c5c-mbcj2\" (UID: \"9e59cfc6-5f20-419b-aab8-74be02eb7d00\") " pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:02:00.618949 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.618808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvdc\" (UniqueName: \"kubernetes.io/projected/9e59cfc6-5f20-419b-aab8-74be02eb7d00-kube-api-access-ptvdc\") pod \"seaweedfs-86cc847c5c-mbcj2\" (UID: \"9e59cfc6-5f20-419b-aab8-74be02eb7d00\") " pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:02:00.619147 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.619120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9e59cfc6-5f20-419b-aab8-74be02eb7d00-data\") pod \"seaweedfs-86cc847c5c-mbcj2\" (UID: \"9e59cfc6-5f20-419b-aab8-74be02eb7d00\") " pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:02:00.628099 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.628076 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvdc\" (UniqueName: \"kubernetes.io/projected/9e59cfc6-5f20-419b-aab8-74be02eb7d00-kube-api-access-ptvdc\") pod \"seaweedfs-86cc847c5c-mbcj2\" (UID: \"9e59cfc6-5f20-419b-aab8-74be02eb7d00\") " pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:02:00.736472 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.736392 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:02:00.857381 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:00.857357 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-mbcj2"] Apr 17 08:02:00.859717 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:02:00.859692 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e59cfc6_5f20_419b_aab8_74be02eb7d00.slice/crio-2e6aec7e2837b6bad4e6f9e20842d543f77d17390a98a85a25e3dec423ad2551 WatchSource:0}: Error finding container 2e6aec7e2837b6bad4e6f9e20842d543f77d17390a98a85a25e3dec423ad2551: Status 404 returned error can't find the container with id 2e6aec7e2837b6bad4e6f9e20842d543f77d17390a98a85a25e3dec423ad2551 Apr 17 08:02:01.630259 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:01.630219 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-mbcj2" event={"ID":"9e59cfc6-5f20-419b-aab8-74be02eb7d00","Type":"ContainerStarted","Data":"2e6aec7e2837b6bad4e6f9e20842d543f77d17390a98a85a25e3dec423ad2551"} Apr 17 08:02:04.640638 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:04.640602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-mbcj2" event={"ID":"9e59cfc6-5f20-419b-aab8-74be02eb7d00","Type":"ContainerStarted","Data":"f0290bbdd4dea4b580449b1d2aeac8b42d13cd3f206ae511182063410762ed18"} Apr 17 08:02:04.641020 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:04.640663 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:02:04.657997 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:04.657953 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-mbcj2" podStartSLOduration=1.766515383 podStartE2EDuration="4.657937547s" podCreationTimestamp="2026-04-17 08:02:00 +0000 UTC" firstStartedPulling="2026-04-17 
08:02:00.861088863 +0000 UTC m=+604.607145648" lastFinishedPulling="2026-04-17 08:02:03.752511035 +0000 UTC m=+607.498567812" observedRunningTime="2026-04-17 08:02:04.657008264 +0000 UTC m=+608.403065058" watchObservedRunningTime="2026-04-17 08:02:04.657937547 +0000 UTC m=+608.403994342" Apr 17 08:02:10.648873 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:02:10.648777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-mbcj2" Apr 17 08:04:28.896976 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.896941 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2"] Apr 17 08:04:28.899326 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.899310 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:28.903008 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.902984 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 08:04:28.903140 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.903113 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 17 08:04:28.903209 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.903133 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 08:04:28.903266 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.903206 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\"" Apr 17 08:04:28.908715 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.908688 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2"] Apr 17 08:04:28.909984 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.909961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:28.910110 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.909991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:28.910110 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.910023 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/882a8e07-c7d0-4b2d-be49-381186691841-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:28.910110 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.910100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") 
" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:28.910269 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.910142 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:28.910269 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:28.910163 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgtgn\" (UniqueName: \"kubernetes.io/projected/882a8e07-c7d0-4b2d-be49-381186691841-kube-api-access-qgtgn\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.010679 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.010639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/882a8e07-c7d0-4b2d-be49-381186691841-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.010892 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.010690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.010892 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.010720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.010892 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.010737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgtgn\" (UniqueName: \"kubernetes.io/projected/882a8e07-c7d0-4b2d-be49-381186691841-kube-api-access-qgtgn\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.010892 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.010764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.010892 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.010784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.011290 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.011268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.011290 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.011280 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.011408 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.011337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.013126 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.013102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.013260 
ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.013241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/882a8e07-c7d0-4b2d-be49-381186691841-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.018905 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.018882 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgtgn\" (UniqueName: \"kubernetes.io/projected/882a8e07-c7d0-4b2d-be49-381186691841-kube-api-access-qgtgn\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.210176 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.210093 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:29.334705 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.334679 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2"] Apr 17 08:04:29.336866 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:04:29.336839 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod882a8e07_c7d0_4b2d_be49_381186691841.slice/crio-4397b9a915477314727e30df509abdd0d9dfe889736ebb263afe6a2f354a2bba WatchSource:0}: Error finding container 4397b9a915477314727e30df509abdd0d9dfe889736ebb263afe6a2f354a2bba: Status 404 returned error can't find the container with id 4397b9a915477314727e30df509abdd0d9dfe889736ebb263afe6a2f354a2bba Apr 17 08:04:29.338738 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:29.338717 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:04:30.101450 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:30.101407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" event={"ID":"882a8e07-c7d0-4b2d-be49-381186691841","Type":"ContainerStarted","Data":"4397b9a915477314727e30df509abdd0d9dfe889736ebb263afe6a2f354a2bba"} Apr 17 08:04:33.113210 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:33.113168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" event={"ID":"882a8e07-c7d0-4b2d-be49-381186691841","Type":"ContainerStarted","Data":"2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f"} Apr 17 08:04:37.128552 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:37.128516 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="882a8e07-c7d0-4b2d-be49-381186691841" containerID="2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f" exitCode=0 Apr 17 08:04:37.128954 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:37.128593 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" event={"ID":"882a8e07-c7d0-4b2d-be49-381186691841","Type":"ContainerDied","Data":"2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f"} Apr 17 08:04:39.136274 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:39.136240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" event={"ID":"882a8e07-c7d0-4b2d-be49-381186691841","Type":"ContainerStarted","Data":"22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10"} Apr 17 08:04:39.156087 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:39.156043 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" podStartSLOduration=2.1916693 podStartE2EDuration="11.15602932s" podCreationTimestamp="2026-04-17 08:04:28 +0000 UTC" firstStartedPulling="2026-04-17 08:04:29.338870206 +0000 UTC m=+753.084926978" lastFinishedPulling="2026-04-17 08:04:38.303230215 +0000 UTC m=+762.049286998" observedRunningTime="2026-04-17 08:04:39.154069246 +0000 UTC m=+762.900126052" watchObservedRunningTime="2026-04-17 08:04:39.15602932 +0000 UTC m=+762.902086114" Apr 17 08:04:39.210855 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:39.210830 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:39.211004 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:39.210867 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:39.223145 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:39.223122 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:04:40.151134 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:04:40.151101 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:05:00.848138 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:00.848104 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2"] Apr 17 08:05:00.848602 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:00.848388 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" podUID="882a8e07-c7d0-4b2d-be49-381186691841" containerName="main" containerID="cri-o://22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10" gracePeriod=30 Apr 17 08:05:01.105178 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.105121 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:05:01.201761 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.201725 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-model-cache\") pod \"882a8e07-c7d0-4b2d-be49-381186691841\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " Apr 17 08:05:01.201956 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.201848 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-home\") pod \"882a8e07-c7d0-4b2d-be49-381186691841\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " Apr 17 08:05:01.201956 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.201880 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgtgn\" (UniqueName: \"kubernetes.io/projected/882a8e07-c7d0-4b2d-be49-381186691841-kube-api-access-qgtgn\") pod \"882a8e07-c7d0-4b2d-be49-381186691841\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " Apr 17 08:05:01.201956 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.201906 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-kserve-provision-location\") pod \"882a8e07-c7d0-4b2d-be49-381186691841\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " Apr 17 08:05:01.201956 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.201927 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-dshm\") pod \"882a8e07-c7d0-4b2d-be49-381186691841\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " Apr 17 08:05:01.201956 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:05:01.201950 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/882a8e07-c7d0-4b2d-be49-381186691841-tls-certs\") pod \"882a8e07-c7d0-4b2d-be49-381186691841\" (UID: \"882a8e07-c7d0-4b2d-be49-381186691841\") " Apr 17 08:05:01.202213 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.202018 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-model-cache" (OuterVolumeSpecName: "model-cache") pod "882a8e07-c7d0-4b2d-be49-381186691841" (UID: "882a8e07-c7d0-4b2d-be49-381186691841"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:05:01.202213 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.202148 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-home" (OuterVolumeSpecName: "home") pod "882a8e07-c7d0-4b2d-be49-381186691841" (UID: "882a8e07-c7d0-4b2d-be49-381186691841"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:05:01.202213 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.202203 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:05:01.204237 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.204209 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882a8e07-c7d0-4b2d-be49-381186691841-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "882a8e07-c7d0-4b2d-be49-381186691841" (UID: "882a8e07-c7d0-4b2d-be49-381186691841"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:05:01.204237 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.204223 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-dshm" (OuterVolumeSpecName: "dshm") pod "882a8e07-c7d0-4b2d-be49-381186691841" (UID: "882a8e07-c7d0-4b2d-be49-381186691841"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:05:01.204376 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.204259 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882a8e07-c7d0-4b2d-be49-381186691841-kube-api-access-qgtgn" (OuterVolumeSpecName: "kube-api-access-qgtgn") pod "882a8e07-c7d0-4b2d-be49-381186691841" (UID: "882a8e07-c7d0-4b2d-be49-381186691841"). InnerVolumeSpecName "kube-api-access-qgtgn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:05:01.209982 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.209738 2573 generic.go:358] "Generic (PLEG): container finished" podID="882a8e07-c7d0-4b2d-be49-381186691841" containerID="22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10" exitCode=0 Apr 17 08:05:01.209982 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.209824 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" Apr 17 08:05:01.209982 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.209817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" event={"ID":"882a8e07-c7d0-4b2d-be49-381186691841","Type":"ContainerDied","Data":"22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10"} Apr 17 08:05:01.209982 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.209928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2" event={"ID":"882a8e07-c7d0-4b2d-be49-381186691841","Type":"ContainerDied","Data":"4397b9a915477314727e30df509abdd0d9dfe889736ebb263afe6a2f354a2bba"} Apr 17 08:05:01.209982 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.209943 2573 scope.go:117] "RemoveContainer" containerID="22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10" Apr 17 08:05:01.222596 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.222578 2573 scope.go:117] "RemoveContainer" containerID="2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f" Apr 17 08:05:01.263202 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.263168 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "882a8e07-c7d0-4b2d-be49-381186691841" (UID: "882a8e07-c7d0-4b2d-be49-381186691841"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:05:01.284892 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.284862 2573 scope.go:117] "RemoveContainer" containerID="22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10" Apr 17 08:05:01.285216 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:05:01.285195 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10\": container with ID starting with 22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10 not found: ID does not exist" containerID="22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10" Apr 17 08:05:01.285266 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.285227 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10"} err="failed to get container status \"22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10\": rpc error: code = NotFound desc = could not find container \"22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10\": container with ID starting with 22ac477f4f4d34e1aaa3829a6df92d683d5468ce8e37c4c773424f55908a5f10 not found: ID does not exist" Apr 17 08:05:01.285266 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.285263 2573 scope.go:117] "RemoveContainer" containerID="2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f" Apr 17 08:05:01.285521 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:05:01.285505 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f\": container with ID starting with 2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f not found: ID does not exist" 
containerID="2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f" Apr 17 08:05:01.285606 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.285527 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f"} err="failed to get container status \"2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f\": rpc error: code = NotFound desc = could not find container \"2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f\": container with ID starting with 2af4901689f35da8c5c4c6afa9436b338b4398dd5e7e2abd6df1780232f3686f not found: ID does not exist" Apr 17 08:05:01.302974 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.302953 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:05:01.303026 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.302976 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qgtgn\" (UniqueName: \"kubernetes.io/projected/882a8e07-c7d0-4b2d-be49-381186691841-kube-api-access-qgtgn\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:05:01.303026 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.302992 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:05:01.303026 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.303002 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/882a8e07-c7d0-4b2d-be49-381186691841-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:05:01.303026 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.303010 
2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/882a8e07-c7d0-4b2d-be49-381186691841-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:05:01.532969 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.532937 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2"] Apr 17 08:05:01.535842 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:01.535820 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6664dcf54ftwqj2"] Apr 17 08:05:02.802118 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:02.802082 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="882a8e07-c7d0-4b2d-be49-381186691841" path="/var/lib/kubelet/pods/882a8e07-c7d0-4b2d-be49-381186691841/volumes" Apr 17 08:05:13.405917 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.405880 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz"] Apr 17 08:05:13.406336 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.406247 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="882a8e07-c7d0-4b2d-be49-381186691841" containerName="storage-initializer" Apr 17 08:05:13.406336 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.406260 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="882a8e07-c7d0-4b2d-be49-381186691841" containerName="storage-initializer" Apr 17 08:05:13.406336 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.406276 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="882a8e07-c7d0-4b2d-be49-381186691841" containerName="main" Apr 17 08:05:13.406336 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.406282 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="882a8e07-c7d0-4b2d-be49-381186691841" containerName="main" Apr 17 08:05:13.406473 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.406339 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="882a8e07-c7d0-4b2d-be49-381186691841" containerName="main" Apr 17 08:05:13.409323 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.409304 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.412053 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.412031 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 08:05:13.412227 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.412210 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 08:05:13.412281 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.412217 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 17 08:05:13.412281 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.412262 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\"" Apr 17 08:05:13.419107 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.419086 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz"] Apr 17 08:05:13.509496 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.509466 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: 
\"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.509645 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.509517 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.509645 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.509617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.509732 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.509649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwxdh\" (UniqueName: \"kubernetes.io/projected/3d92f950-9568-4765-a41a-5b3d534722af-kube-api-access-vwxdh\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.509732 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.509682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-home\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.509732 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.509701 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d92f950-9568-4765-a41a-5b3d534722af-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.610935 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.610894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.611128 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.610941 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwxdh\" (UniqueName: \"kubernetes.io/projected/3d92f950-9568-4765-a41a-5b3d534722af-kube-api-access-vwxdh\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.611128 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.610982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-home\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.611128 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.611008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d92f950-9568-4765-a41a-5b3d534722af-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.611128 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.611044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.611128 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.611104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.611387 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.611340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: 
\"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.611387 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.611350 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.611483 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.611447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.613274 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.613240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.613577 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.613560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d92f950-9568-4765-a41a-5b3d534722af-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 
08:05:13.619246 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.619227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwxdh\" (UniqueName: \"kubernetes.io/projected/3d92f950-9568-4765-a41a-5b3d534722af-kube-api-access-vwxdh\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.720491 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.720407 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:13.846184 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:13.846155 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz"] Apr 17 08:05:13.847878 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:05:13.847849 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d92f950_9568_4765_a41a_5b3d534722af.slice/crio-8b2624eed28737b5a55944f1bece3ae7de05763494933548e49754ae1ce6e768 WatchSource:0}: Error finding container 8b2624eed28737b5a55944f1bece3ae7de05763494933548e49754ae1ce6e768: Status 404 returned error can't find the container with id 8b2624eed28737b5a55944f1bece3ae7de05763494933548e49754ae1ce6e768 Apr 17 08:05:14.256163 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:14.256129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" event={"ID":"3d92f950-9568-4765-a41a-5b3d534722af","Type":"ContainerStarted","Data":"942ac2154d13ba2b78676e09d19156d30af27877851f51e332de16c85edb9865"} Apr 17 08:05:14.256163 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:14.256167 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" event={"ID":"3d92f950-9568-4765-a41a-5b3d534722af","Type":"ContainerStarted","Data":"8b2624eed28737b5a55944f1bece3ae7de05763494933548e49754ae1ce6e768"} Apr 17 08:05:23.288137 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:23.288098 2573 generic.go:358] "Generic (PLEG): container finished" podID="3d92f950-9568-4765-a41a-5b3d534722af" containerID="942ac2154d13ba2b78676e09d19156d30af27877851f51e332de16c85edb9865" exitCode=0 Apr 17 08:05:23.288617 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:23.288175 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" event={"ID":"3d92f950-9568-4765-a41a-5b3d534722af","Type":"ContainerDied","Data":"942ac2154d13ba2b78676e09d19156d30af27877851f51e332de16c85edb9865"} Apr 17 08:05:36.026751 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.026715 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l"] Apr 17 08:05:36.044373 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.044299 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.045632 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.045589 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l"] Apr 17 08:05:36.047572 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.047412 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-ddw9n\"" Apr 17 08:05:36.047572 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.047431 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 17 08:05:36.133286 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.133252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.133286 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.133296 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlf4f\" (UniqueName: \"kubernetes.io/projected/13dceeff-6cd9-434c-b59c-90501be3e1a5-kube-api-access-vlf4f\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.133359 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.133464 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.133512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.133659 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.133548 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13dceeff-6cd9-434c-b59c-90501be3e1a5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.234663 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.234627 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.234846 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.234726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.234846 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.234763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.234846 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.234820 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13dceeff-6cd9-434c-b59c-90501be3e1a5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.235029 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.234916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.235029 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.234946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlf4f\" (UniqueName: \"kubernetes.io/projected/13dceeff-6cd9-434c-b59c-90501be3e1a5-kube-api-access-vlf4f\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.235139 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.235076 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.235190 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.235135 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.235263 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.235248 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.235338 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.235320 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.237566 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.237538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13dceeff-6cd9-434c-b59c-90501be3e1a5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.242975 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.242952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlf4f\" (UniqueName: \"kubernetes.io/projected/13dceeff-6cd9-434c-b59c-90501be3e1a5-kube-api-access-vlf4f\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.357778 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.357695 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:05:36.506899 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:36.506859 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l"] Apr 17 08:05:36.509163 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:05:36.509128 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13dceeff_6cd9_434c_b59c_90501be3e1a5.slice/crio-6795ddd107e48a554b243e297087569b87839932f0ff9bc26821d77f6d24371a WatchSource:0}: Error finding container 6795ddd107e48a554b243e297087569b87839932f0ff9bc26821d77f6d24371a: Status 404 returned error can't find the container with id 6795ddd107e48a554b243e297087569b87839932f0ff9bc26821d77f6d24371a Apr 17 08:05:37.348315 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:37.348271 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" event={"ID":"13dceeff-6cd9-434c-b59c-90501be3e1a5","Type":"ContainerStarted","Data":"1a4d58173aff7ca8fae4acf7560f7379925672dfc19093b5be6b6d02f648dacc"} Apr 17 08:05:37.348743 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:37.348322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" event={"ID":"13dceeff-6cd9-434c-b59c-90501be3e1a5","Type":"ContainerStarted","Data":"6795ddd107e48a554b243e297087569b87839932f0ff9bc26821d77f6d24371a"} Apr 17 08:05:49.391901 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:49.391861 2573 generic.go:358] "Generic (PLEG): container finished" podID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerID="1a4d58173aff7ca8fae4acf7560f7379925672dfc19093b5be6b6d02f648dacc" exitCode=0 Apr 17 08:05:49.392362 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:49.391907 
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" event={"ID":"13dceeff-6cd9-434c-b59c-90501be3e1a5","Type":"ContainerDied","Data":"1a4d58173aff7ca8fae4acf7560f7379925672dfc19093b5be6b6d02f648dacc"} Apr 17 08:05:50.397266 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:50.397227 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" event={"ID":"3d92f950-9568-4765-a41a-5b3d534722af","Type":"ContainerStarted","Data":"96c6ecb9265e733f1791b1cc20d6685692f03f42ee762e719f11aba3b31661a4"} Apr 17 08:05:50.417393 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:50.417238 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podStartSLOduration=10.520384188 podStartE2EDuration="37.417218835s" podCreationTimestamp="2026-04-17 08:05:13 +0000 UTC" firstStartedPulling="2026-04-17 08:05:23.289426994 +0000 UTC m=+807.035483765" lastFinishedPulling="2026-04-17 08:05:50.186261635 +0000 UTC m=+833.932318412" observedRunningTime="2026-04-17 08:05:50.415779765 +0000 UTC m=+834.161836572" watchObservedRunningTime="2026-04-17 08:05:50.417218835 +0000 UTC m=+834.163275642" Apr 17 08:05:51.403462 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:51.403424 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" event={"ID":"13dceeff-6cd9-434c-b59c-90501be3e1a5","Type":"ContainerStarted","Data":"305453b7b2ce3525c14ae0c0411a0051f580aa22b03b2a79159cd8292a6ba77b"} Apr 17 08:05:53.721536 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:53.721498 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:53.722097 
ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:53.721649 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:05:53.723592 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:05:53.723530 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused" Apr 17 08:06:03.721544 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:03.721494 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused" Apr 17 08:06:08.512903 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:08.512867 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l"] Apr 17 08:06:13.721536 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:13.721483 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused" Apr 17 08:06:23.721568 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:23.721525 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused" Apr 17 08:06:24.537754 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:24.537716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" event={"ID":"13dceeff-6cd9-434c-b59c-90501be3e1a5","Type":"ContainerStarted","Data":"e67013f2fba0576625403450c59f2b835e64549f127d5b8868e7f08e3a987e86"} Apr 17 08:06:24.537976 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:24.537805 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="main" containerID="cri-o://305453b7b2ce3525c14ae0c0411a0051f580aa22b03b2a79159cd8292a6ba77b" gracePeriod=30 Apr 17 08:06:24.537976 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:24.537855 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="tokenizer" containerID="cri-o://e67013f2fba0576625403450c59f2b835e64549f127d5b8868e7f08e3a987e86" gracePeriod=30 Apr 17 08:06:24.537976 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:24.537907 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:06:24.540870 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:24.540838 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 17 08:06:24.558398 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:06:24.558336 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" podStartSLOduration=14.381035468 podStartE2EDuration="48.558317903s" podCreationTimestamp="2026-04-17 08:05:36 +0000 UTC" firstStartedPulling="2026-04-17 08:05:49.393409181 +0000 UTC m=+833.139465953" lastFinishedPulling="2026-04-17 08:06:23.570691614 +0000 UTC m=+867.316748388" observedRunningTime="2026-04-17 08:06:24.556662659 +0000 UTC m=+868.302719458" watchObservedRunningTime="2026-04-17 08:06:24.558317903 +0000 UTC m=+868.304374698" Apr 17 08:06:25.544454 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:25.544418 2573 generic.go:358] "Generic (PLEG): container finished" podID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerID="305453b7b2ce3525c14ae0c0411a0051f580aa22b03b2a79159cd8292a6ba77b" exitCode=0 Apr 17 08:06:25.544829 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:25.544457 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" event={"ID":"13dceeff-6cd9-434c-b59c-90501be3e1a5","Type":"ContainerDied","Data":"305453b7b2ce3525c14ae0c0411a0051f580aa22b03b2a79159cd8292a6ba77b"} Apr 17 08:06:26.358145 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:26.358109 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:06:33.721783 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:33.721738 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused" Apr 17 08:06:34.539176 ip-10-0-137-8 
kubenswrapper[2573]: W0417 08:06:34.539145 2573 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.30:9003", ServerName: "10.133.0.30:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.30:9003: connect: connection refused" Apr 17 08:06:35.539742 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:35.539701 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.30:9003\" within 1s: context deadline exceeded" Apr 17 08:06:43.721221 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:43.721131 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused" Apr 17 08:06:44.539354 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:06:44.539324 2573 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.30:9003", ServerName: "10.133.0.30:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.30:9003: connect: connection refused" Apr 17 08:06:45.539507 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:45.539461 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.30:9003\" within 1s: context deadline exceeded" Apr 17 08:06:53.720968 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:53.720914 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused" Apr 17 08:06:54.539120 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:06:54.539088 2573 logging.go:55] [core] [Channel #24 SubChannel #25]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.30:9003", ServerName: "10.133.0.30:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.30:9003: connect: connection refused" Apr 17 08:06:54.645621 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:54.645590 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l_13dceeff-6cd9-434c-b59c-90501be3e1a5/tokenizer/0.log" Apr 17 08:06:54.646285 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:54.646258 2573 generic.go:358] "Generic (PLEG): container finished" podID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerID="e67013f2fba0576625403450c59f2b835e64549f127d5b8868e7f08e3a987e86" exitCode=137 Apr 17 08:06:54.646414 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:54.646334 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" event={"ID":"13dceeff-6cd9-434c-b59c-90501be3e1a5","Type":"ContainerDied","Data":"e67013f2fba0576625403450c59f2b835e64549f127d5b8868e7f08e3a987e86"} Apr 17 08:06:55.197334 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.197308 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l_13dceeff-6cd9-434c-b59c-90501be3e1a5/tokenizer/0.log" Apr 17 08:06:55.198130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.198110 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" Apr 17 08:06:55.298507 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.298467 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13dceeff-6cd9-434c-b59c-90501be3e1a5-tls-certs\") pod \"13dceeff-6cd9-434c-b59c-90501be3e1a5\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " Apr 17 08:06:55.298703 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.298528 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-tmp\") pod \"13dceeff-6cd9-434c-b59c-90501be3e1a5\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " Apr 17 08:06:55.298703 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.298609 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-uds\") pod \"13dceeff-6cd9-434c-b59c-90501be3e1a5\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " Apr 17 08:06:55.298703 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.298686 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlf4f\" (UniqueName: \"kubernetes.io/projected/13dceeff-6cd9-434c-b59c-90501be3e1a5-kube-api-access-vlf4f\") pod \"13dceeff-6cd9-434c-b59c-90501be3e1a5\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " Apr 17 08:06:55.298912 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.298716 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-kserve-provision-location\") pod \"13dceeff-6cd9-434c-b59c-90501be3e1a5\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") " Apr 17 
08:06:55.298912 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.298746 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-cache\") pod \"13dceeff-6cd9-434c-b59c-90501be3e1a5\" (UID: \"13dceeff-6cd9-434c-b59c-90501be3e1a5\") "
Apr 17 08:06:55.299028 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.298935 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "13dceeff-6cd9-434c-b59c-90501be3e1a5" (UID: "13dceeff-6cd9-434c-b59c-90501be3e1a5"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:06:55.299153 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.299121 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-uds\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:06:55.299287 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.299154 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "13dceeff-6cd9-434c-b59c-90501be3e1a5" (UID: "13dceeff-6cd9-434c-b59c-90501be3e1a5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:06:55.299287 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.299164 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "13dceeff-6cd9-434c-b59c-90501be3e1a5" (UID: "13dceeff-6cd9-434c-b59c-90501be3e1a5"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:06:55.299466 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.299442 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "13dceeff-6cd9-434c-b59c-90501be3e1a5" (UID: "13dceeff-6cd9-434c-b59c-90501be3e1a5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:06:55.300714 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.300688 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dceeff-6cd9-434c-b59c-90501be3e1a5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "13dceeff-6cd9-434c-b59c-90501be3e1a5" (UID: "13dceeff-6cd9-434c-b59c-90501be3e1a5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:06:55.301274 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.301253 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dceeff-6cd9-434c-b59c-90501be3e1a5-kube-api-access-vlf4f" (OuterVolumeSpecName: "kube-api-access-vlf4f") pod "13dceeff-6cd9-434c-b59c-90501be3e1a5" (UID: "13dceeff-6cd9-434c-b59c-90501be3e1a5"). InnerVolumeSpecName "kube-api-access-vlf4f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:06:55.400129 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.400092 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/13dceeff-6cd9-434c-b59c-90501be3e1a5-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:06:55.400129 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.400124 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-tmp\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:06:55.400377 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.400139 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vlf4f\" (UniqueName: \"kubernetes.io/projected/13dceeff-6cd9-434c-b59c-90501be3e1a5-kube-api-access-vlf4f\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:06:55.400377 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.400153 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:06:55.400377 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.400165 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/13dceeff-6cd9-434c-b59c-90501be3e1a5-tokenizer-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:06:55.539357 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.539296 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.30:9003\" within 1s: context deadline exceeded"
Apr 17 08:06:55.650994 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.650921 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l_13dceeff-6cd9-434c-b59c-90501be3e1a5/tokenizer/0.log"
Apr 17 08:06:55.651659 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.651632 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l" event={"ID":"13dceeff-6cd9-434c-b59c-90501be3e1a5","Type":"ContainerDied","Data":"6795ddd107e48a554b243e297087569b87839932f0ff9bc26821d77f6d24371a"}
Apr 17 08:06:55.651775 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.651672 2573 scope.go:117] "RemoveContainer" containerID="e67013f2fba0576625403450c59f2b835e64549f127d5b8868e7f08e3a987e86"
Apr 17 08:06:55.651775 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.651681 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l"
Apr 17 08:06:55.660500 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.660471 2573 scope.go:117] "RemoveContainer" containerID="305453b7b2ce3525c14ae0c0411a0051f580aa22b03b2a79159cd8292a6ba77b"
Apr 17 08:06:55.668506 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.668486 2573 scope.go:117] "RemoveContainer" containerID="1a4d58173aff7ca8fae4acf7560f7379925672dfc19093b5be6b6d02f648dacc"
Apr 17 08:06:55.673902 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.673876 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l"]
Apr 17 08:06:55.677225 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:55.677208 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c95685bkl6l"]
Apr 17 08:06:56.744150 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:56.744048 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log"
Apr 17 08:06:56.744468 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:56.744359 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log"
Apr 17 08:06:56.802410 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:06:56.802369 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" path="/var/lib/kubelet/pods/13dceeff-6cd9-434c-b59c-90501be3e1a5/volumes"
Apr 17 08:07:03.721171 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:03.721127 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused"
Apr 17 08:07:13.721017 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:13.720972 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused"
Apr 17 08:07:15.502948 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.502916 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"]
Apr 17 08:07:15.503361 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.503285 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="main"
Apr 17 08:07:15.503361 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.503299 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="main"
Apr 17 08:07:15.503361 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.503310 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="tokenizer"
Apr 17 08:07:15.503361 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.503315 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="tokenizer"
Apr 17 08:07:15.503361 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.503333 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="storage-initializer"
Apr 17 08:07:15.503361 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.503339 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="storage-initializer"
Apr 17 08:07:15.503545 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.503391 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="main"
Apr 17 08:07:15.503545 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.503402 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="13dceeff-6cd9-434c-b59c-90501be3e1a5" containerName="tokenizer"
Apr 17 08:07:15.505432 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.505414 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.507954 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.507930 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\""
Apr 17 08:07:15.516568 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.516546 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"]
Apr 17 08:07:15.572880 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.572850 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-home\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.573026 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.572891 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz2f5\" (UniqueName: \"kubernetes.io/projected/d769fafa-93f2-4059-93a3-d3344517c7c2-kube-api-access-fz2f5\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.573026 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.572968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-model-cache\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.573026 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.573016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-dshm\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.573136 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.573076 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d769fafa-93f2-4059-93a3-d3344517c7c2-tls-certs\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.573136 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.573102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-kserve-provision-location\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.674479 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.674447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-dshm\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.674661 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.674506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d769fafa-93f2-4059-93a3-d3344517c7c2-tls-certs\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.674661 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.674537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-kserve-provision-location\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.674775 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.674671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-home\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.674775 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.674728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fz2f5\" (UniqueName: \"kubernetes.io/projected/d769fafa-93f2-4059-93a3-d3344517c7c2-kube-api-access-fz2f5\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.674896 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.674812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-model-cache\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.674990 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.674961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-kserve-provision-location\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.675055 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.675011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-home\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.675150 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.675132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-model-cache\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.676821 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.676782 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-dshm\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.677087 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.677068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d769fafa-93f2-4059-93a3-d3344517c7c2-tls-certs\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.683714 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.683686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz2f5\" (UniqueName: \"kubernetes.io/projected/d769fafa-93f2-4059-93a3-d3344517c7c2-kube-api-access-fz2f5\") pod \"conv-test-round-trip-kserve-565674756f-mdcbp\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.816191 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.816109 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:15.950469 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:15.950435 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"]
Apr 17 08:07:15.951747 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:07:15.951711 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd769fafa_93f2_4059_93a3_d3344517c7c2.slice/crio-56dd8bddb26dcd1f99d8d9d61f3281b7ecb3e22a7556e22a8b0474ac8cbb027f WatchSource:0}: Error finding container 56dd8bddb26dcd1f99d8d9d61f3281b7ecb3e22a7556e22a8b0474ac8cbb027f: Status 404 returned error can't find the container with id 56dd8bddb26dcd1f99d8d9d61f3281b7ecb3e22a7556e22a8b0474ac8cbb027f
Apr 17 08:07:16.726713 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:16.726667 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" event={"ID":"d769fafa-93f2-4059-93a3-d3344517c7c2","Type":"ContainerStarted","Data":"58d437efbc6e6d9c4038f64562f1ce68c61d993d6e89cedb9ac3a1dec086c9e8"}
Apr 17 08:07:16.727113 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:16.726723 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" event={"ID":"d769fafa-93f2-4059-93a3-d3344517c7c2","Type":"ContainerStarted","Data":"56dd8bddb26dcd1f99d8d9d61f3281b7ecb3e22a7556e22a8b0474ac8cbb027f"}
Apr 17 08:07:20.743765 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:20.743727 2573 generic.go:358] "Generic (PLEG): container finished" podID="d769fafa-93f2-4059-93a3-d3344517c7c2" containerID="58d437efbc6e6d9c4038f64562f1ce68c61d993d6e89cedb9ac3a1dec086c9e8" exitCode=0
Apr 17 08:07:20.744169 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:20.743813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" event={"ID":"d769fafa-93f2-4059-93a3-d3344517c7c2","Type":"ContainerDied","Data":"58d437efbc6e6d9c4038f64562f1ce68c61d993d6e89cedb9ac3a1dec086c9e8"}
Apr 17 08:07:21.749507 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:21.749473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" event={"ID":"d769fafa-93f2-4059-93a3-d3344517c7c2","Type":"ContainerStarted","Data":"4ef325b9722cd16243de229d5ee6a86ff7e63bd32525afa4d40966a3a9f02daf"}
Apr 17 08:07:21.798298 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:21.797914 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" podStartSLOduration=6.797896578 podStartE2EDuration="6.797896578s" podCreationTimestamp="2026-04-17 08:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:07:21.796517875 +0000 UTC m=+925.542574670" watchObservedRunningTime="2026-04-17 08:07:21.797896578 +0000 UTC m=+925.543953375"
Apr 17 08:07:22.001730 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.001644 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"]
Apr 17 08:07:22.006906 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.006885 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.009417 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.009363 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 17 08:07:22.016861 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.016833 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"]
Apr 17 08:07:22.036701 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.036671 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-dshm\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.036701 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.036704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-model-cache\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.036964 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.036728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-home\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.036964 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.036755 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zlzr\" (UniqueName: \"kubernetes.io/projected/21adde8b-58cd-45e2-b552-84d07627a75b-kube-api-access-8zlzr\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.036964 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.036841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.036964 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.036864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-kserve-provision-location\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.137625 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.137588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-dshm\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.137625 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.137632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-model-cache\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.137906 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.137657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-home\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.137906 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.137697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zlzr\" (UniqueName: \"kubernetes.io/projected/21adde8b-58cd-45e2-b552-84d07627a75b-kube-api-access-8zlzr\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.137906 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.137740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.137906 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.137884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-kserve-provision-location\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.138227 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.138196 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-home\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.138461 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.138437 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-kserve-provision-location\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.138767 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.138745 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-model-cache\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.140440 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.140415 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-dshm\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.140693 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.140670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.146718 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.146690 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zlzr\" (UniqueName: \"kubernetes.io/projected/21adde8b-58cd-45e2-b552-84d07627a75b-kube-api-access-8zlzr\") pod \"stop-feature-test-kserve-859586b86d-vtcjm\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.320213 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.320117 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:22.467391 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.467358 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"]
Apr 17 08:07:22.470943 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:07:22.470905 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21adde8b_58cd_45e2_b552_84d07627a75b.slice/crio-86fc418617cb8dabeca0c30825f25b02a13c77dc96e188e4caff031c3a62d102 WatchSource:0}: Error finding container 86fc418617cb8dabeca0c30825f25b02a13c77dc96e188e4caff031c3a62d102: Status 404 returned error can't find the container with id 86fc418617cb8dabeca0c30825f25b02a13c77dc96e188e4caff031c3a62d102
Apr 17 08:07:22.756085 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.756039 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" event={"ID":"21adde8b-58cd-45e2-b552-84d07627a75b","Type":"ContainerStarted","Data":"96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860"}
Apr 17 08:07:22.756085 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:22.756089 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" event={"ID":"21adde8b-58cd-45e2-b552-84d07627a75b","Type":"ContainerStarted","Data":"86fc418617cb8dabeca0c30825f25b02a13c77dc96e188e4caff031c3a62d102"}
Apr 17 08:07:23.721182 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:23.721127 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" probeResult="failure" output="Get \"https://10.133.0.29:8000/health\": dial tcp 10.133.0.29:8000: connect: connection refused"
Apr 17 08:07:24.778642 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:24.778605 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"]
Apr 17 08:07:24.779576 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:24.779518 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" podUID="d769fafa-93f2-4059-93a3-d3344517c7c2" containerName="main" containerID="cri-o://4ef325b9722cd16243de229d5ee6a86ff7e63bd32525afa4d40966a3a9f02daf" gracePeriod=30
Apr 17 08:07:25.816924 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:25.816877 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"
Apr 17 08:07:26.779102 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:26.779059 2573 generic.go:358] "Generic (PLEG): container finished" podID="21adde8b-58cd-45e2-b552-84d07627a75b" containerID="96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860" exitCode=0
Apr 17 08:07:26.779267 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:26.779137 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" event={"ID":"21adde8b-58cd-45e2-b552-84d07627a75b","Type":"ContainerDied","Data":"96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860"}
Apr 17 08:07:27.784612 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:27.784568 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" event={"ID":"21adde8b-58cd-45e2-b552-84d07627a75b","Type":"ContainerStarted","Data":"745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d"}
Apr 17 08:07:27.806541 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:27.806485 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podStartSLOduration=6.806469631 podStartE2EDuration="6.806469631s" podCreationTimestamp="2026-04-17 08:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:07:27.804050664 +0000 UTC m=+931.550107470" watchObservedRunningTime="2026-04-17 08:07:27.806469631 +0000 UTC m=+931.552526425"
Apr 17 08:07:32.320968 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:32.320924 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:32.320968 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:32.320975 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"
Apr 17 08:07:32.322436 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:32.322393 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 08:07:33.732155 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:33.732122 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz"
Apr 17 08:07:33.741107 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:33.741080 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz"
Apr 17 08:07:42.321222 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:42.321172 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 08:07:52.321260 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:52.321209 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 08:07:52.789345 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:52.789309 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz"]
Apr 17 08:07:52.789734 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:52.789675 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main" containerID="cri-o://96c6ecb9265e733f1791b1cc20d6685692f03f42ee762e719f11aba3b31661a4" gracePeriod=30
Apr 17 08:07:54.887630 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:54.887603 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-565674756f-mdcbp_d769fafa-93f2-4059-93a3-d3344517c7c2/main/0.log"
Apr 17 08:07:54.888026 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:54.888003 2573 generic.go:358] "Generic (PLEG): container finished" podID="d769fafa-93f2-4059-93a3-d3344517c7c2" containerID="4ef325b9722cd16243de229d5ee6a86ff7e63bd32525afa4d40966a3a9f02daf" exitCode=137
Apr 17 08:07:54.888110 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:54.888081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" event={"ID":"d769fafa-93f2-4059-93a3-d3344517c7c2","Type":"ContainerDied","Data":"4ef325b9722cd16243de229d5ee6a86ff7e63bd32525afa4d40966a3a9f02daf"}
Apr 17 08:07:54.986258 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:54.986237 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-565674756f-mdcbp_d769fafa-93f2-4059-93a3-d3344517c7c2/main/0.log"
Apr 17 08:07:54.986601 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:54.986586 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" Apr 17 08:07:55.065755 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.065681 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-kserve-provision-location\") pod \"d769fafa-93f2-4059-93a3-d3344517c7c2\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " Apr 17 08:07:55.065755 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.065718 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-home\") pod \"d769fafa-93f2-4059-93a3-d3344517c7c2\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " Apr 17 08:07:55.065972 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.065778 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-model-cache\") pod \"d769fafa-93f2-4059-93a3-d3344517c7c2\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " Apr 17 08:07:55.065972 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.065866 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-dshm\") pod \"d769fafa-93f2-4059-93a3-d3344517c7c2\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " Apr 17 08:07:55.065972 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.065887 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d769fafa-93f2-4059-93a3-d3344517c7c2-tls-certs\") pod \"d769fafa-93f2-4059-93a3-d3344517c7c2\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " Apr 17 08:07:55.065972 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.065923 2573 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz2f5\" (UniqueName: \"kubernetes.io/projected/d769fafa-93f2-4059-93a3-d3344517c7c2-kube-api-access-fz2f5\") pod \"d769fafa-93f2-4059-93a3-d3344517c7c2\" (UID: \"d769fafa-93f2-4059-93a3-d3344517c7c2\") " Apr 17 08:07:55.066175 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.065981 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-home" (OuterVolumeSpecName: "home") pod "d769fafa-93f2-4059-93a3-d3344517c7c2" (UID: "d769fafa-93f2-4059-93a3-d3344517c7c2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:07:55.066175 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.066109 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-model-cache" (OuterVolumeSpecName: "model-cache") pod "d769fafa-93f2-4059-93a3-d3344517c7c2" (UID: "d769fafa-93f2-4059-93a3-d3344517c7c2"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:07:55.066278 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.066178 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:07:55.066278 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.066195 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:07:55.068192 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.068159 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d769fafa-93f2-4059-93a3-d3344517c7c2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d769fafa-93f2-4059-93a3-d3344517c7c2" (UID: "d769fafa-93f2-4059-93a3-d3344517c7c2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:07:55.068481 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.068452 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-dshm" (OuterVolumeSpecName: "dshm") pod "d769fafa-93f2-4059-93a3-d3344517c7c2" (UID: "d769fafa-93f2-4059-93a3-d3344517c7c2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:07:55.068569 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.068502 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d769fafa-93f2-4059-93a3-d3344517c7c2-kube-api-access-fz2f5" (OuterVolumeSpecName: "kube-api-access-fz2f5") pod "d769fafa-93f2-4059-93a3-d3344517c7c2" (UID: "d769fafa-93f2-4059-93a3-d3344517c7c2"). InnerVolumeSpecName "kube-api-access-fz2f5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:07:55.119188 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.119142 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d769fafa-93f2-4059-93a3-d3344517c7c2" (UID: "d769fafa-93f2-4059-93a3-d3344517c7c2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:07:55.167213 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.167175 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:07:55.167213 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.167207 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d769fafa-93f2-4059-93a3-d3344517c7c2-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:07:55.167213 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.167218 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fz2f5\" (UniqueName: \"kubernetes.io/projected/d769fafa-93f2-4059-93a3-d3344517c7c2-kube-api-access-fz2f5\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:07:55.167428 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.167230 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d769fafa-93f2-4059-93a3-d3344517c7c2-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:07:55.892944 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.892913 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-565674756f-mdcbp_d769fafa-93f2-4059-93a3-d3344517c7c2/main/0.log" Apr 17 08:07:55.893320 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.893295 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" event={"ID":"d769fafa-93f2-4059-93a3-d3344517c7c2","Type":"ContainerDied","Data":"56dd8bddb26dcd1f99d8d9d61f3281b7ecb3e22a7556e22a8b0474ac8cbb027f"} Apr 17 08:07:55.893378 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.893339 2573 scope.go:117] "RemoveContainer" containerID="4ef325b9722cd16243de229d5ee6a86ff7e63bd32525afa4d40966a3a9f02daf" Apr 17 08:07:55.893413 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.893340 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp" Apr 17 08:07:55.902152 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.902130 2573 scope.go:117] "RemoveContainer" containerID="58d437efbc6e6d9c4038f64562f1ce68c61d993d6e89cedb9ac3a1dec086c9e8" Apr 17 08:07:55.916190 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.916164 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"] Apr 17 08:07:55.920231 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:55.920205 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-565674756f-mdcbp"] Apr 17 08:07:56.804176 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:07:56.804146 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d769fafa-93f2-4059-93a3-d3344517c7c2" path="/var/lib/kubelet/pods/d769fafa-93f2-4059-93a3-d3344517c7c2/volumes" Apr 17 08:08:01.884629 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.884591 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"] 
Apr 17 08:08:01.885189 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.885128 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d769fafa-93f2-4059-93a3-d3344517c7c2" containerName="storage-initializer"
Apr 17 08:08:01.885189 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.885149 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d769fafa-93f2-4059-93a3-d3344517c7c2" containerName="storage-initializer"
Apr 17 08:08:01.885304 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.885199 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d769fafa-93f2-4059-93a3-d3344517c7c2" containerName="main"
Apr 17 08:08:01.885304 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.885208 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d769fafa-93f2-4059-93a3-d3344517c7c2" containerName="main"
Apr 17 08:08:01.885304 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.885287 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d769fafa-93f2-4059-93a3-d3344517c7c2" containerName="main"
Apr 17 08:08:01.887555 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.887533 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:01.890305 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.890285 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 17 08:08:01.899385 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.899359 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"]
Apr 17 08:08:01.927272 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.927232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-home\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:01.927432 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.927293 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-model-cache\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:01.927432 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.927351 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwvcs\" (UniqueName: \"kubernetes.io/projected/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kube-api-access-fwvcs\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:01.927432 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.927385 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-tls-certs\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:01.927574 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.927448 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-dshm\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:01.927574 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:01.927522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.028577 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.028537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-dshm\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.028770 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.028599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.028770 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.028665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-home\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.028770 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.028699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-model-cache\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.028770 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.028732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwvcs\" (UniqueName: \"kubernetes.io/projected/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kube-api-access-fwvcs\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.028770 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.028759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-tls-certs\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.029150 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.029111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.029150 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.029128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-model-cache\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.029249 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.029194 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-home\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.031163 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.031142 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-tls-certs\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.031262 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.031228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-dshm\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.036768 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.036746 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwvcs\" (UniqueName: \"kubernetes.io/projected/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kube-api-access-fwvcs\") pod \"custom-route-timeout-test-kserve-68b64477d8-x9f75\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.199619 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.199539 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:02.321354 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.321314 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 08:08:02.334657 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.334633 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"]
Apr 17 08:08:02.337548 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:08:02.337518 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd1c1534_eb70_4d3f_a09a_3587c42b4b7f.slice/crio-dafd94ba35522e8453b4e026131f26fec372fdcde320d527ba2743180dd4261c WatchSource:0}: Error finding container dafd94ba35522e8453b4e026131f26fec372fdcde320d527ba2743180dd4261c: Status 404 returned error can't find the container with id dafd94ba35522e8453b4e026131f26fec372fdcde320d527ba2743180dd4261c
Apr 17 08:08:02.919164 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.919121 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" event={"ID":"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f","Type":"ContainerStarted","Data":"5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358"}
Apr 17 08:08:02.919164 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:02.919162 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" event={"ID":"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f","Type":"ContainerStarted","Data":"dafd94ba35522e8453b4e026131f26fec372fdcde320d527ba2743180dd4261c"}
Apr 17 08:08:06.938674 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:06.938637 2573 generic.go:358] "Generic (PLEG): container finished" podID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerID="5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358" exitCode=0
Apr 17 08:08:06.938674 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:06.938682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" event={"ID":"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f","Type":"ContainerDied","Data":"5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358"}
Apr 17 08:08:07.945321 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:07.945287 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" event={"ID":"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f","Type":"ContainerStarted","Data":"515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44"}
Apr 17 08:08:07.969843 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:07.969749 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podStartSLOduration=6.969729112 podStartE2EDuration="6.969729112s" podCreationTimestamp="2026-04-17 08:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:08:07.966308487 +0000 UTC m=+971.712365304" watchObservedRunningTime="2026-04-17 08:08:07.969729112 +0000 UTC m=+971.715785920"
Apr 17 08:08:12.200317 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:12.200224 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:12.200317 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:12.200272 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:08:12.201822 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:12.201771 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 17 08:08:12.320750 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:12.320710 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 08:08:22.200619 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:22.200569 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused"
Apr 17 08:08:22.320805 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:22.320747 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused"
Apr 17 08:08:23.004205 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.004170 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz_3d92f950-9568-4765-a41a-5b3d534722af/main/0.log"
Apr 17 08:08:23.004592 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.004563 2573 generic.go:358] "Generic (PLEG): container finished" podID="3d92f950-9568-4765-a41a-5b3d534722af" containerID="96c6ecb9265e733f1791b1cc20d6685692f03f42ee762e719f11aba3b31661a4" exitCode=137
Apr 17 08:08:23.004700 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.004611 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" event={"ID":"3d92f950-9568-4765-a41a-5b3d534722af","Type":"ContainerDied","Data":"96c6ecb9265e733f1791b1cc20d6685692f03f42ee762e719f11aba3b31661a4"}
Apr 17 08:08:23.048851 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.048766 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz_3d92f950-9568-4765-a41a-5b3d534722af/main/0.log"
Apr 17 08:08:23.049162 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.049144 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz"
Apr 17 08:08:23.237141 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.237106 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwxdh\" (UniqueName: \"kubernetes.io/projected/3d92f950-9568-4765-a41a-5b3d534722af-kube-api-access-vwxdh\") pod \"3d92f950-9568-4765-a41a-5b3d534722af\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") "
Apr 17 08:08:23.237651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.237157 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-kserve-provision-location\") pod \"3d92f950-9568-4765-a41a-5b3d534722af\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") "
Apr 17 08:08:23.237651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.237205 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-model-cache\") pod \"3d92f950-9568-4765-a41a-5b3d534722af\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") "
Apr 17 08:08:23.237651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.237223 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-home\") pod \"3d92f950-9568-4765-a41a-5b3d534722af\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") "
Apr 17 08:08:23.237651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.237421 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-dshm\") pod \"3d92f950-9568-4765-a41a-5b3d534722af\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") "
Apr 17 08:08:23.237651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.237486 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d92f950-9568-4765-a41a-5b3d534722af-tls-certs\") pod \"3d92f950-9568-4765-a41a-5b3d534722af\" (UID: \"3d92f950-9568-4765-a41a-5b3d534722af\") "
Apr 17 08:08:23.237651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.237526 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-model-cache" (OuterVolumeSpecName: "model-cache") pod "3d92f950-9568-4765-a41a-5b3d534722af" (UID: "3d92f950-9568-4765-a41a-5b3d534722af"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:08:23.238025 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.237889 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:08:23.238355 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.238327 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-home" (OuterVolumeSpecName: "home") pod "3d92f950-9568-4765-a41a-5b3d534722af" (UID: "3d92f950-9568-4765-a41a-5b3d534722af"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:08:23.239647 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.239622 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-dshm" (OuterVolumeSpecName: "dshm") pod "3d92f950-9568-4765-a41a-5b3d534722af" (UID: "3d92f950-9568-4765-a41a-5b3d534722af"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:08:23.239982 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.239953 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d92f950-9568-4765-a41a-5b3d534722af-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3d92f950-9568-4765-a41a-5b3d534722af" (UID: "3d92f950-9568-4765-a41a-5b3d534722af"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:08:23.240089 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.239953 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d92f950-9568-4765-a41a-5b3d534722af-kube-api-access-vwxdh" (OuterVolumeSpecName: "kube-api-access-vwxdh") pod "3d92f950-9568-4765-a41a-5b3d534722af" (UID: "3d92f950-9568-4765-a41a-5b3d534722af"). InnerVolumeSpecName "kube-api-access-vwxdh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:08:23.294701 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.294646 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3d92f950-9568-4765-a41a-5b3d534722af" (UID: "3d92f950-9568-4765-a41a-5b3d534722af"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:08:23.338893 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.338809 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:08:23.338893 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.338838 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:08:23.338893 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.338849 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d92f950-9568-4765-a41a-5b3d534722af-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:08:23.338893 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.338861 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwxdh\" (UniqueName: \"kubernetes.io/projected/3d92f950-9568-4765-a41a-5b3d534722af-kube-api-access-vwxdh\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:08:23.338893 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:23.338874 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d92f950-9568-4765-a41a-5b3d534722af-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:08:24.009758 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:24.009727 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz_3d92f950-9568-4765-a41a-5b3d534722af/main/0.log"
Apr 17 08:08:24.010233 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:24.010214 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" Apr 17 08:08:24.010354 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:24.010211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz" event={"ID":"3d92f950-9568-4765-a41a-5b3d534722af","Type":"ContainerDied","Data":"8b2624eed28737b5a55944f1bece3ae7de05763494933548e49754ae1ce6e768"} Apr 17 08:08:24.010413 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:24.010354 2573 scope.go:117] "RemoveContainer" containerID="96c6ecb9265e733f1791b1cc20d6685692f03f42ee762e719f11aba3b31661a4" Apr 17 08:08:24.031069 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:24.031043 2573 scope.go:117] "RemoveContainer" containerID="942ac2154d13ba2b78676e09d19156d30af27877851f51e332de16c85edb9865" Apr 17 08:08:24.034110 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:24.034083 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz"] Apr 17 08:08:24.038434 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:24.038404 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-57bc7f9c98qmlbz"] Apr 17 08:08:24.803480 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:24.803441 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d92f950-9568-4765-a41a-5b3d534722af" path="/var/lib/kubelet/pods/3d92f950-9568-4765-a41a-5b3d534722af/volumes" Apr 17 08:08:32.199989 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:32.199939 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" 
Apr 17 08:08:32.320687 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:32.320641 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 17 08:08:42.200214 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:42.200165 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 17 08:08:42.321178 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:42.321131 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 17 08:08:52.200898 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:52.200842 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 17 08:08:52.321605 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:08:52.321558 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" probeResult="failure" output="Get \"https://10.133.0.32:8000/health\": dial tcp 10.133.0.32:8000: connect: connection refused" Apr 
17 08:09:02.200699 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:02.200651 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 17 08:09:02.331014 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:02.330979 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" Apr 17 08:09:02.338838 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:02.338817 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" Apr 17 08:09:03.182526 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:03.182494 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"] Apr 17 08:09:03.312233 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:03.312200 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:03.312682 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:03.312276 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs podName:21adde8b-58cd-45e2-b552-84d07627a75b nodeName:}" failed. No retries permitted until 2026-04-17 08:09:03.812260572 +0000 UTC m=+1027.558317344 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs") pod "stop-feature-test-kserve-859586b86d-vtcjm" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:03.816979 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:03.816947 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:03.817150 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:03.817022 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs podName:21adde8b-58cd-45e2-b552-84d07627a75b nodeName:}" failed. No retries permitted until 2026-04-17 08:09:04.817006807 +0000 UTC m=+1028.563063579 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs") pod "stop-feature-test-kserve-859586b86d-vtcjm" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:04.166922 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:04.166880 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" containerID="cri-o://745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d" gracePeriod=30 Apr 17 08:09:04.824924 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:04.824893 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:04.825290 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:04.824956 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs podName:21adde8b-58cd-45e2-b552-84d07627a75b nodeName:}" failed. No retries permitted until 2026-04-17 08:09:06.824940762 +0000 UTC m=+1030.570997533 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs") pod "stop-feature-test-kserve-859586b86d-vtcjm" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:06.843962 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:06.843927 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:06.844348 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:06.844003 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs podName:21adde8b-58cd-45e2-b552-84d07627a75b nodeName:}" failed. No retries permitted until 2026-04-17 08:09:10.843989517 +0000 UTC m=+1034.590046289 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs") pod "stop-feature-test-kserve-859586b86d-vtcjm" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:10.873884 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:10.873850 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:10.874278 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:10.873939 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs podName:21adde8b-58cd-45e2-b552-84d07627a75b nodeName:}" failed. No retries permitted until 2026-04-17 08:09:18.873916549 +0000 UTC m=+1042.619973324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs") pod "stop-feature-test-kserve-859586b86d-vtcjm" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:12.200740 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:12.200691 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 17 08:09:18.954253 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:18.954216 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:18.954646 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:18.954289 2573 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs podName:21adde8b-58cd-45e2-b552-84d07627a75b nodeName:}" failed. No retries permitted until 2026-04-17 08:09:34.954269677 +0000 UTC m=+1058.700326452 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs") pod "stop-feature-test-kserve-859586b86d-vtcjm" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 17 08:09:22.200129 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:22.200090 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 17 08:09:32.200771 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:32.200729 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.33:8000/health\": dial tcp 10.133.0.33:8000: connect: connection refused" Apr 17 08:09:34.440998 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.440963 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-859586b86d-vtcjm_21adde8b-58cd-45e2-b552-84d07627a75b/main/0.log" Apr 17 08:09:34.441391 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.441361 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" Apr 17 08:09:34.597455 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.597424 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-home\") pod \"21adde8b-58cd-45e2-b552-84d07627a75b\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " Apr 17 08:09:34.597455 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.597459 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-kserve-provision-location\") pod \"21adde8b-58cd-45e2-b552-84d07627a75b\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " Apr 17 08:09:34.597683 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.597519 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-model-cache\") pod \"21adde8b-58cd-45e2-b552-84d07627a75b\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " Apr 17 08:09:34.597683 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.597551 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zlzr\" (UniqueName: \"kubernetes.io/projected/21adde8b-58cd-45e2-b552-84d07627a75b-kube-api-access-8zlzr\") pod \"21adde8b-58cd-45e2-b552-84d07627a75b\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " Apr 17 08:09:34.597683 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.597593 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs\") pod \"21adde8b-58cd-45e2-b552-84d07627a75b\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " Apr 17 08:09:34.597683 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:09:34.597645 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-dshm\") pod \"21adde8b-58cd-45e2-b552-84d07627a75b\" (UID: \"21adde8b-58cd-45e2-b552-84d07627a75b\") " Apr 17 08:09:34.597929 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.597800 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-model-cache" (OuterVolumeSpecName: "model-cache") pod "21adde8b-58cd-45e2-b552-84d07627a75b" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:09:34.597929 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.597871 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-home" (OuterVolumeSpecName: "home") pod "21adde8b-58cd-45e2-b552-84d07627a75b" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:09:34.598050 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.598002 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:09:34.598050 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.598022 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:09:34.599808 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.599757 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "21adde8b-58cd-45e2-b552-84d07627a75b" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:09:34.599931 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.599812 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21adde8b-58cd-45e2-b552-84d07627a75b-kube-api-access-8zlzr" (OuterVolumeSpecName: "kube-api-access-8zlzr") pod "21adde8b-58cd-45e2-b552-84d07627a75b" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b"). InnerVolumeSpecName "kube-api-access-8zlzr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:09:34.599931 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.599835 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-dshm" (OuterVolumeSpecName: "dshm") pod "21adde8b-58cd-45e2-b552-84d07627a75b" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:09:34.664782 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.664742 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "21adde8b-58cd-45e2-b552-84d07627a75b" (UID: "21adde8b-58cd-45e2-b552-84d07627a75b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:09:34.699359 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.699286 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:09:34.699359 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.699316 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8zlzr\" (UniqueName: \"kubernetes.io/projected/21adde8b-58cd-45e2-b552-84d07627a75b-kube-api-access-8zlzr\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:09:34.699359 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.699332 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21adde8b-58cd-45e2-b552-84d07627a75b-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:09:34.699359 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:34.699344 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21adde8b-58cd-45e2-b552-84d07627a75b-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:09:35.272478 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.272450 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-859586b86d-vtcjm_21adde8b-58cd-45e2-b552-84d07627a75b/main/0.log" Apr 17 08:09:35.272768 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.272740 2573 generic.go:358] "Generic (PLEG): container finished" podID="21adde8b-58cd-45e2-b552-84d07627a75b" containerID="745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d" exitCode=137 Apr 17 08:09:35.272847 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.272827 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" event={"ID":"21adde8b-58cd-45e2-b552-84d07627a75b","Type":"ContainerDied","Data":"745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d"} Apr 17 08:09:35.272892 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.272863 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" Apr 17 08:09:35.272892 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.272877 2573 scope.go:117] "RemoveContainer" containerID="745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d" Apr 17 08:09:35.272982 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.272867 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm" event={"ID":"21adde8b-58cd-45e2-b552-84d07627a75b","Type":"ContainerDied","Data":"86fc418617cb8dabeca0c30825f25b02a13c77dc96e188e4caff031c3a62d102"} Apr 17 08:09:35.292299 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.292282 2573 scope.go:117] "RemoveContainer" containerID="96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860" Apr 17 08:09:35.295772 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.295749 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"] Apr 17 08:09:35.298583 ip-10-0-137-8 kubenswrapper[2573]: I0417 
08:09:35.298558 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-vtcjm"] Apr 17 08:09:35.381091 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.380133 2573 scope.go:117] "RemoveContainer" containerID="745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d" Apr 17 08:09:35.381091 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:35.380561 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d\": container with ID starting with 745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d not found: ID does not exist" containerID="745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d" Apr 17 08:09:35.381091 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.380596 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d"} err="failed to get container status \"745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d\": rpc error: code = NotFound desc = could not find container \"745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d\": container with ID starting with 745a013cbfa4baf7f2be8ad53dd091e258628546c030a24773373cb2b652e39d not found: ID does not exist" Apr 17 08:09:35.381091 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.380624 2573 scope.go:117] "RemoveContainer" containerID="96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860" Apr 17 08:09:35.381825 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:09:35.381769 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860\": container with ID starting with 96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860 not 
found: ID does not exist" containerID="96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860" Apr 17 08:09:35.381935 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:35.381832 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860"} err="failed to get container status \"96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860\": rpc error: code = NotFound desc = could not find container \"96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860\": container with ID starting with 96701e2b1e9a93f6eed78f74206b6dcdcacc3e2b2f562ccfe0b9a81aaec75860 not found: ID does not exist" Apr 17 08:09:36.802859 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:36.802827 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" path="/var/lib/kubelet/pods/21adde8b-58cd-45e2-b552-84d07627a75b/volumes" Apr 17 08:09:39.605678 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.605597 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"] Apr 17 08:09:39.606127 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606039 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="storage-initializer" Apr 17 08:09:39.606127 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606054 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="storage-initializer" Apr 17 08:09:39.606127 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606086 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main" Apr 17 08:09:39.606127 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606096 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main"
Apr 17 08:09:39.606127 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606109 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main"
Apr 17 08:09:39.606127 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606118 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main"
Apr 17 08:09:39.606127 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606130 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="storage-initializer"
Apr 17 08:09:39.606387 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606136 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="storage-initializer"
Apr 17 08:09:39.606387 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606200 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d92f950-9568-4765-a41a-5b3d534722af" containerName="main"
Apr 17 08:09:39.606387 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.606208 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="21adde8b-58cd-45e2-b552-84d07627a75b" containerName="main"
Apr 17 08:09:39.611315 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.611295 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.613924 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.613907 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 17 08:09:39.618233 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.618210 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"]
Apr 17 08:09:39.746511 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.746472 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-dshm\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.746673 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.746544 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-model-cache\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.746673 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.746563 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-tls-certs\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.746673 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.746585 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kserve-provision-location\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.746802 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.746702 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-home\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.746802 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.746770 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvp8c\" (UniqueName: \"kubernetes.io/projected/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kube-api-access-gvp8c\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.847474 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.847442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-home\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.847612 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.847488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvp8c\" (UniqueName: \"kubernetes.io/projected/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kube-api-access-gvp8c\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.847612 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.847510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-dshm\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.847612 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.847554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-model-cache\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.847612 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.847578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-tls-certs\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.847612 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.847606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kserve-provision-location\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.847953 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.847932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-home\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.848032 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.848008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-model-cache\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.848079 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.848046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kserve-provision-location\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.849812 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.849763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-dshm\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.849969 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.849952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-tls-certs\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") "
pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.854540 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.854518 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvp8c\" (UniqueName: \"kubernetes.io/projected/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kube-api-access-gvp8c\") pod \"stop-feature-test-kserve-859586b86d-ns4qn\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:39.922673 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:39.922649 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:40.054507 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:40.054471 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"]
Apr 17 08:09:40.055828 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:09:40.055780 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4327f6_c4a4_404e_b6e6_1448c5d9567f.slice/crio-cf25ee6bb11fab90ecee9a0483d78e81d94929ffc253f3ad75ce0850643c61b4 WatchSource:0}: Error finding container cf25ee6bb11fab90ecee9a0483d78e81d94929ffc253f3ad75ce0850643c61b4: Status 404 returned error can't find the container with id cf25ee6bb11fab90ecee9a0483d78e81d94929ffc253f3ad75ce0850643c61b4
Apr 17 08:09:40.057896 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:40.057878 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:09:40.292552 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:40.292473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" event={"ID":"6b4327f6-c4a4-404e-b6e6-1448c5d9567f","Type":"ContainerStarted","Data":"e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9"}
Apr 17 08:09:40.292552 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:40.292511 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" event={"ID":"6b4327f6-c4a4-404e-b6e6-1448c5d9567f","Type":"ContainerStarted","Data":"cf25ee6bb11fab90ecee9a0483d78e81d94929ffc253f3ad75ce0850643c61b4"}
Apr 17 08:09:42.210400 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:42.210360 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:09:42.218574 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:42.218542 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:09:44.310203 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:44.310117 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerID="e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9" exitCode=0
Apr 17 08:09:44.310511 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:44.310198 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" event={"ID":"6b4327f6-c4a4-404e-b6e6-1448c5d9567f","Type":"ContainerDied","Data":"e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9"}
Apr 17 08:09:45.315194 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:45.315161 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" event={"ID":"6b4327f6-c4a4-404e-b6e6-1448c5d9567f","Type":"ContainerStarted","Data":"708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d"}
Apr 17 08:09:45.334924 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:45.334864 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podStartSLOduration=6.334847261 podStartE2EDuration="6.334847261s" podCreationTimestamp="2026-04-17 08:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:09:45.333108026 +0000 UTC m=+1069.079164819" watchObservedRunningTime="2026-04-17 08:09:45.334847261 +0000 UTC m=+1069.080904056"
Apr 17 08:09:49.923508 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:49.923469 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:49.923508 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:49.923514 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"
Apr 17 08:09:49.925149 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:49.925120 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 08:09:58.409217 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:58.409108 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"]
Apr 17 08:09:58.410145 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:58.410105 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main" containerID="cri-o://515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44" gracePeriod=30
Apr 17 08:09:59.923568 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:09:59.923524 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 08:10:09.189384 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.189337 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"]
Apr 17 08:10:09.194334 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.194309 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.196689 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.196669 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 17 08:10:09.204407 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.204385 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"]
Apr 17 08:10:09.302078 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.302043 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-dshm\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.302271 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.302085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName:
\"kubernetes.io/secret/caacc9dd-ad02-4113-bd62-6534a73f48d5-tls-certs\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.302271 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.302164 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-model-cache\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.302271 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.302221 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-kserve-provision-location\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.302271 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.302265 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-home\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.302444 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.302310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqprv\" (UniqueName: \"kubernetes.io/projected/caacc9dd-ad02-4113-bd62-6534a73f48d5-kube-api-access-cqprv\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.402749 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.402719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-dshm\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.402749 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.402752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/caacc9dd-ad02-4113-bd62-6534a73f48d5-tls-certs\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.403020 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.402903 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-model-cache\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.403020 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.402979 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-kserve-provision-location\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.403122 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.403036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-home\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.403122 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.403093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqprv\" (UniqueName: \"kubernetes.io/projected/caacc9dd-ad02-4113-bd62-6534a73f48d5-kube-api-access-cqprv\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.403280 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.403258 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-model-cache\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.403371 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.403346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-kserve-provision-location\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.403444 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.403422 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-home\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.405065 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.405045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-dshm\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.405317 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.405300 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/caacc9dd-ad02-4113-bd62-6534a73f48d5-tls-certs\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.410639 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.410618 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqprv\" (UniqueName: \"kubernetes.io/projected/caacc9dd-ad02-4113-bd62-6534a73f48d5-kube-api-access-cqprv\") pod \"router-with-refs-test-kserve-6b94f6968-hq7kh\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.507294 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.507189 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:09.638098 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.638030 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"]
Apr 17 08:10:09.640460 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:10:09.640428 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaacc9dd_ad02_4113_bd62_6534a73f48d5.slice/crio-4be9ed574298cbac9b042a3e4cd3b0f9bf1a377ef4f93ffafaae0122c16d2822 WatchSource:0}: Error finding container 4be9ed574298cbac9b042a3e4cd3b0f9bf1a377ef4f93ffafaae0122c16d2822: Status 404 returned error can't find the container with id 4be9ed574298cbac9b042a3e4cd3b0f9bf1a377ef4f93ffafaae0122c16d2822
Apr 17 08:10:09.923187 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:09.923134 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 08:10:10.406443 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:10.406397 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" event={"ID":"caacc9dd-ad02-4113-bd62-6534a73f48d5","Type":"ContainerStarted","Data":"5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e"}
Apr 17 08:10:10.406443 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:10.406439 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" event={"ID":"caacc9dd-ad02-4113-bd62-6534a73f48d5","Type":"ContainerStarted","Data":"4be9ed574298cbac9b042a3e4cd3b0f9bf1a377ef4f93ffafaae0122c16d2822"}
Apr 17 08:10:14.421322 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:14.421280 2573 generic.go:358] "Generic (PLEG): container finished" podID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerID="5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e" exitCode=0
Apr 17 08:10:14.421709 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:14.421350 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" event={"ID":"caacc9dd-ad02-4113-bd62-6534a73f48d5","Type":"ContainerDied","Data":"5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e"}
Apr 17 08:10:15.426539 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:15.426506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" event={"ID":"caacc9dd-ad02-4113-bd62-6534a73f48d5","Type":"ContainerStarted","Data":"1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a"}
Apr 17 08:10:15.448349 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:15.448290 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podStartSLOduration=6.448271135 podStartE2EDuration="6.448271135s" podCreationTimestamp="2026-04-17 08:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:10:15.445978634 +0000 UTC m=+1099.192035426" watchObservedRunningTime="2026-04-17 08:10:15.448271135 +0000 UTC m=+1099.194327930"
Apr 17 08:10:19.507879 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:19.507832 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:19.507879 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:19.507883 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"
Apr 17 08:10:19.509138 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:19.509106 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused"
Apr 17 08:10:19.923390 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:19.923341 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused"
Apr 17 08:10:28.711055 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.711023 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-68b64477d8-x9f75_bd1c1534-eb70-4d3f-a09a-3587c42b4b7f/main/0.log"
Apr 17 08:10:28.711478 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.711462 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:10:28.887026 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.886991 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-model-cache\") pod \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") "
Apr 17 08:10:28.887219 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.887036 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kserve-provision-location\") pod \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") "
Apr 17 08:10:28.887219 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.887111 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-tls-certs\") pod \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") "
Apr 17 08:10:28.887219 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.887136 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-home\") pod \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") "
Apr 17 08:10:28.887219 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.887159 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwvcs\" (UniqueName: \"kubernetes.io/projected/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kube-api-access-fwvcs\") pod \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") "
Apr 17 08:10:28.887219 ip-10-0-137-8
kubenswrapper[2573]: I0417 08:10:28.887191 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-dshm\") pod \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\" (UID: \"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f\") "
Apr 17 08:10:28.887545 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.887298 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-model-cache" (OuterVolumeSpecName: "model-cache") pod "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" (UID: "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:10:28.887608 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.887545 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:10:28.887608 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.887564 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-home" (OuterVolumeSpecName: "home") pod "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" (UID: "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:10:28.890003 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.889968 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kube-api-access-fwvcs" (OuterVolumeSpecName: "kube-api-access-fwvcs") pod "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" (UID: "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f"). InnerVolumeSpecName "kube-api-access-fwvcs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:10:28.890003 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.889991 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" (UID: "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:10:28.890194 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.890130 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-dshm" (OuterVolumeSpecName: "dshm") pod "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" (UID: "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:10:28.949321 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.949224 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" (UID: "bd1c1534-eb70-4d3f-a09a-3587c42b4b7f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:10:28.988835 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.988777 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:10:28.988835 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.988832 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:10:28.989033 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.988845 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:10:28.989033 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.988860 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fwvcs\" (UniqueName: \"kubernetes.io/projected/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-kube-api-access-fwvcs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:10:28.989033 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:28.988874 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:10:29.483560 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.483533 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-68b64477d8-x9f75_bd1c1534-eb70-4d3f-a09a-3587c42b4b7f/main/0.log"
Apr 17 08:10:29.483904 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.483879 2573 generic.go:358] "Generic (PLEG): container finished" podID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerID="515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44" exitCode=137
Apr 17 08:10:29.484018 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.483954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" event={"ID":"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f","Type":"ContainerDied","Data":"515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44"}
Apr 17 08:10:29.484018 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.483970 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"
Apr 17 08:10:29.484018 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.483993 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75" event={"ID":"bd1c1534-eb70-4d3f-a09a-3587c42b4b7f","Type":"ContainerDied","Data":"dafd94ba35522e8453b4e026131f26fec372fdcde320d527ba2743180dd4261c"}
Apr 17 08:10:29.484018 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.484009 2573 scope.go:117] "RemoveContainer" containerID="515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44"
Apr 17 08:10:29.505099 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.505078 2573 scope.go:117] "RemoveContainer" containerID="5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358"
Apr 17 08:10:29.507658 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.507629 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused"
Apr 17 08:10:29.510657 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.510627 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"]
Apr 17 08:10:29.515073 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.515047 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-68b64477d8-x9f75"]
Apr 17 08:10:29.574177 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.574146 2573 scope.go:117] "RemoveContainer" containerID="515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44"
Apr 17 08:10:29.574606 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:10:29.574570 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44\": container with ID starting with 515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44 not found: ID does not exist" containerID="515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44"
Apr 17 08:10:29.574761 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.574616 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44"} err="failed to get container status \"515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44\": rpc error: code = NotFound desc = could not find container \"515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44\": container with ID starting with 515fde16e28723c5e958be328506d298adf218f0ffc192c30311f7aedb4aaf44 not found: ID does not exist"
Apr 17 08:10:29.574761 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.574646 2573 scope.go:117] "RemoveContainer" containerID="5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358"
Apr 17 08:10:29.575053 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:10:29.575014 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358\": container with ID starting with 5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358 not found: ID does not exist" containerID="5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358" Apr 17 08:10:29.575138 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.575048 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358"} err="failed to get container status \"5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358\": rpc error: code = NotFound desc = could not find container \"5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358\": container with ID starting with 5bda6d994806939e8fa03ecd0a3c6ca26743394d391816d1b83d6629d15ec358 not found: ID does not exist" Apr 17 08:10:29.923273 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:29.923218 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 08:10:30.812559 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:30.812520 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" path="/var/lib/kubelet/pods/bd1c1534-eb70-4d3f-a09a-3587c42b4b7f/volumes" Apr 17 08:10:39.508212 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:39.508169 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 17 08:10:39.923432 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:10:39.923396 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 08:10:49.508224 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:49.508174 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 17 08:10:49.924100 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:49.924053 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 08:10:59.507731 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:59.507682 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 17 08:10:59.923748 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:10:59.923691 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 08:11:09.508056 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:11:09.507951 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 17 08:11:09.923327 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:09.923287 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 08:11:19.507992 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:19.507945 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 17 08:11:19.923528 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:19.923492 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" probeResult="failure" output="Get \"https://10.133.0.34:8000/health\": dial tcp 10.133.0.34:8000: connect: connection refused" Apr 17 08:11:29.507901 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:29.507852 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 17 08:11:29.939000 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:11:29.938960 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" Apr 17 08:11:29.946909 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:29.946882 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" Apr 17 08:11:31.217474 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:31.217435 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"] Apr 17 08:11:31.700733 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:31.700689 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main" containerID="cri-o://708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d" gracePeriod=30 Apr 17 08:11:39.508197 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:39.508149 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 17 08:11:49.508570 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:49.508514 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" probeResult="failure" output="Get \"https://10.133.0.35:8000/health\": dial tcp 10.133.0.35:8000: connect: connection refused" Apr 17 08:11:56.774121 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:56.774088 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 08:11:56.781570 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:56.781542 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 08:11:59.517813 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:59.517757 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" Apr 17 08:11:59.525489 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:11:59.525461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" Apr 17 08:12:01.988220 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:01.988184 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-859586b86d-ns4qn_6b4327f6-c4a4-404e-b6e6-1448c5d9567f/main/0.log" Apr 17 08:12:01.988640 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:01.988624 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" Apr 17 08:12:02.031864 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.031830 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-dshm\") pod \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " Apr 17 08:12:02.031864 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.031872 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvp8c\" (UniqueName: \"kubernetes.io/projected/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kube-api-access-gvp8c\") pod \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " Apr 17 08:12:02.032110 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.031923 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-tls-certs\") pod \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " Apr 17 08:12:02.032110 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.031939 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-home\") pod \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " Apr 17 08:12:02.032110 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.031960 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kserve-provision-location\") pod \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " Apr 17 08:12:02.032297 ip-10-0-137-8 kubenswrapper[2573]: I0417 
08:12:02.032274 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-model-cache\") pod \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\" (UID: \"6b4327f6-c4a4-404e-b6e6-1448c5d9567f\") " Apr 17 08:12:02.032610 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.032582 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-home" (OuterVolumeSpecName: "home") pod "6b4327f6-c4a4-404e-b6e6-1448c5d9567f" (UID: "6b4327f6-c4a4-404e-b6e6-1448c5d9567f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:12:02.032709 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.032579 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-model-cache" (OuterVolumeSpecName: "model-cache") pod "6b4327f6-c4a4-404e-b6e6-1448c5d9567f" (UID: "6b4327f6-c4a4-404e-b6e6-1448c5d9567f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:12:02.034305 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.034277 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-dshm" (OuterVolumeSpecName: "dshm") pod "6b4327f6-c4a4-404e-b6e6-1448c5d9567f" (UID: "6b4327f6-c4a4-404e-b6e6-1448c5d9567f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:12:02.034567 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.034531 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kube-api-access-gvp8c" (OuterVolumeSpecName: "kube-api-access-gvp8c") pod "6b4327f6-c4a4-404e-b6e6-1448c5d9567f" (UID: "6b4327f6-c4a4-404e-b6e6-1448c5d9567f"). 
InnerVolumeSpecName "kube-api-access-gvp8c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:12:02.034848 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.034820 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6b4327f6-c4a4-404e-b6e6-1448c5d9567f" (UID: "6b4327f6-c4a4-404e-b6e6-1448c5d9567f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:12:02.090871 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.090817 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6b4327f6-c4a4-404e-b6e6-1448c5d9567f" (UID: "6b4327f6-c4a4-404e-b6e6-1448c5d9567f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:12:02.133896 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.133856 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:12:02.133896 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.133886 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:12:02.133896 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.133896 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:12:02.133896 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:12:02.133905 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gvp8c\" (UniqueName: \"kubernetes.io/projected/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-kube-api-access-gvp8c\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:12:02.134245 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.133940 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:12:02.134245 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.133951 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b4327f6-c4a4-404e-b6e6-1448c5d9567f-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:12:02.804740 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.804706 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-859586b86d-ns4qn_6b4327f6-c4a4-404e-b6e6-1448c5d9567f/main/0.log" Apr 17 08:12:02.805071 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.805045 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerID="708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d" exitCode=137 Apr 17 08:12:02.805161 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.805128 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" event={"ID":"6b4327f6-c4a4-404e-b6e6-1448c5d9567f","Type":"ContainerDied","Data":"708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d"} Apr 17 08:12:02.805225 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.805167 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" 
event={"ID":"6b4327f6-c4a4-404e-b6e6-1448c5d9567f","Type":"ContainerDied","Data":"cf25ee6bb11fab90ecee9a0483d78e81d94929ffc253f3ad75ce0850643c61b4"} Apr 17 08:12:02.805225 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.805183 2573 scope.go:117] "RemoveContainer" containerID="708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d" Apr 17 08:12:02.805225 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.805136 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn" Apr 17 08:12:02.825461 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.825441 2573 scope.go:117] "RemoveContainer" containerID="e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9" Apr 17 08:12:02.830550 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.830516 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"] Apr 17 08:12:02.833871 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.833845 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-859586b86d-ns4qn"] Apr 17 08:12:02.836323 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.836305 2573 scope.go:117] "RemoveContainer" containerID="708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d" Apr 17 08:12:02.836614 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:12:02.836583 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d\": container with ID starting with 708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d not found: ID does not exist" containerID="708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d" Apr 17 08:12:02.836702 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.836626 2573 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d"} err="failed to get container status \"708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d\": rpc error: code = NotFound desc = could not find container \"708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d\": container with ID starting with 708ef7d9ed145b1979f65c558c65f0a986038c3aba8f506a6efeea3220e07d6d not found: ID does not exist" Apr 17 08:12:02.836702 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.836654 2573 scope.go:117] "RemoveContainer" containerID="e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9" Apr 17 08:12:02.836931 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:12:02.836913 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9\": container with ID starting with e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9 not found: ID does not exist" containerID="e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9" Apr 17 08:12:02.836986 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:02.836939 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9"} err="failed to get container status \"e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9\": rpc error: code = NotFound desc = could not find container \"e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9\": container with ID starting with e003441538358747e07d1e7d3abef8d99367cb0fedfcbb5d57312963e6bf55a9 not found: ID does not exist" Apr 17 08:12:04.803152 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:04.803117 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" 
path="/var/lib/kubelet/pods/6b4327f6-c4a4-404e-b6e6-1448c5d9567f/volumes" Apr 17 08:12:05.513007 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:05.512955 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"] Apr 17 08:12:05.513267 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:05.513244 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main" containerID="cri-o://1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a" gracePeriod=30 Apr 17 08:12:35.756123 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.756099 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-6b94f6968-hq7kh_caacc9dd-ad02-4113-bd62-6534a73f48d5/main/0.log" Apr 17 08:12:35.756490 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.756422 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" Apr 17 08:12:35.919453 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.919427 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-6b94f6968-hq7kh_caacc9dd-ad02-4113-bd62-6534a73f48d5/main/0.log" Apr 17 08:12:35.919775 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.919751 2573 generic.go:358] "Generic (PLEG): container finished" podID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerID="1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a" exitCode=137 Apr 17 08:12:35.919880 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.919851 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" event={"ID":"caacc9dd-ad02-4113-bd62-6534a73f48d5","Type":"ContainerDied","Data":"1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a"} Apr 17 08:12:35.919922 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.919902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" event={"ID":"caacc9dd-ad02-4113-bd62-6534a73f48d5","Type":"ContainerDied","Data":"4be9ed574298cbac9b042a3e4cd3b0f9bf1a377ef4f93ffafaae0122c16d2822"} Apr 17 08:12:35.919922 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.919919 2573 scope.go:117] "RemoveContainer" containerID="1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a" Apr 17 08:12:35.919981 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.919860 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh" Apr 17 08:12:35.929162 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.929143 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/caacc9dd-ad02-4113-bd62-6534a73f48d5-tls-certs\") pod \"caacc9dd-ad02-4113-bd62-6534a73f48d5\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " Apr 17 08:12:35.929275 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.929173 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-kserve-provision-location\") pod \"caacc9dd-ad02-4113-bd62-6534a73f48d5\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " Apr 17 08:12:35.929275 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.929198 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqprv\" (UniqueName: \"kubernetes.io/projected/caacc9dd-ad02-4113-bd62-6534a73f48d5-kube-api-access-cqprv\") pod \"caacc9dd-ad02-4113-bd62-6534a73f48d5\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " Apr 17 08:12:35.929275 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.929227 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-dshm\") pod \"caacc9dd-ad02-4113-bd62-6534a73f48d5\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " Apr 17 08:12:35.929430 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.929341 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-model-cache\") pod \"caacc9dd-ad02-4113-bd62-6534a73f48d5\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") " Apr 17 08:12:35.929430 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:12:35.929425 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-home\") pod \"caacc9dd-ad02-4113-bd62-6534a73f48d5\" (UID: \"caacc9dd-ad02-4113-bd62-6534a73f48d5\") "
Apr 17 08:12:35.929858 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.929602 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-model-cache" (OuterVolumeSpecName: "model-cache") pod "caacc9dd-ad02-4113-bd62-6534a73f48d5" (UID: "caacc9dd-ad02-4113-bd62-6534a73f48d5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:12:35.929858 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.929828 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:12:35.930007 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.929983 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-home" (OuterVolumeSpecName: "home") pod "caacc9dd-ad02-4113-bd62-6534a73f48d5" (UID: "caacc9dd-ad02-4113-bd62-6534a73f48d5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:12:35.931698 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.931674 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-dshm" (OuterVolumeSpecName: "dshm") pod "caacc9dd-ad02-4113-bd62-6534a73f48d5" (UID: "caacc9dd-ad02-4113-bd62-6534a73f48d5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:12:35.931698 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.931689 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caacc9dd-ad02-4113-bd62-6534a73f48d5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "caacc9dd-ad02-4113-bd62-6534a73f48d5" (UID: "caacc9dd-ad02-4113-bd62-6534a73f48d5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:12:35.931849 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.931707 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caacc9dd-ad02-4113-bd62-6534a73f48d5-kube-api-access-cqprv" (OuterVolumeSpecName: "kube-api-access-cqprv") pod "caacc9dd-ad02-4113-bd62-6534a73f48d5" (UID: "caacc9dd-ad02-4113-bd62-6534a73f48d5"). InnerVolumeSpecName "kube-api-access-cqprv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:12:35.945115 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.945097 2573 scope.go:117] "RemoveContainer" containerID="5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e"
Apr 17 08:12:35.997569 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:35.997526 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "caacc9dd-ad02-4113-bd62-6534a73f48d5" (UID: "caacc9dd-ad02-4113-bd62-6534a73f48d5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:12:36.012344 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.012327 2573 scope.go:117] "RemoveContainer" containerID="1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a"
Apr 17 08:12:36.012665 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:12:36.012645 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a\": container with ID starting with 1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a not found: ID does not exist" containerID="1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a"
Apr 17 08:12:36.012725 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.012674 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a"} err="failed to get container status \"1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a\": rpc error: code = NotFound desc = could not find container \"1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a\": container with ID starting with 1cabbf96d2e6382d111edf4814ff8f747250fbec95cd5948b316633556ee913a not found: ID does not exist"
Apr 17 08:12:36.012725 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.012695 2573 scope.go:117] "RemoveContainer" containerID="5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e"
Apr 17 08:12:36.012940 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:12:36.012927 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e\": container with ID starting with 5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e not found: ID does not exist" containerID="5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e"
Apr 17 08:12:36.012984 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.012942 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e"} err="failed to get container status \"5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e\": rpc error: code = NotFound desc = could not find container \"5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e\": container with ID starting with 5f0977b1389e0945215605f795bc97f47d14699645e63cb6dc5a6b693b0e933e not found: ID does not exist"
Apr 17 08:12:36.031021 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.031001 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/caacc9dd-ad02-4113-bd62-6534a73f48d5-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:12:36.031021 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.031019 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:12:36.031142 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.031029 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqprv\" (UniqueName: \"kubernetes.io/projected/caacc9dd-ad02-4113-bd62-6534a73f48d5-kube-api-access-cqprv\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:12:36.031142 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.031038 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:12:36.031142 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.031046 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/caacc9dd-ad02-4113-bd62-6534a73f48d5-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:12:36.243068 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.243037 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"]
Apr 17 08:12:36.247740 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.247716 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-6b94f6968-hq7kh"]
Apr 17 08:12:36.804265 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:36.804231 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" path="/var/lib/kubelet/pods/caacc9dd-ad02-4113-bd62-6534a73f48d5/volumes"
Apr 17 08:12:43.117684 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.117607 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"]
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.117946 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.117957 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.117966 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="storage-initializer"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.117971 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="storage-initializer"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.117981 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="storage-initializer"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.117987 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="storage-initializer"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.117999 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="storage-initializer"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.118004 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="storage-initializer"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.118010 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.118015 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.118021 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.118026 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.118077 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd1c1534-eb70-4d3f-a09a-3587c42b4b7f" containerName="main"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.118085 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="caacc9dd-ad02-4113-bd62-6534a73f48d5" containerName="main"
Apr 17 08:12:43.118130 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.118093 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b4327f6-c4a4-404e-b6e6-1448c5d9567f" containerName="main"
Apr 17 08:12:43.122306 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.122287 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.125062 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.125040 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 17 08:12:43.125189 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.125091 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 08:12:43.125189 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.125123 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 08:12:43.125830 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.125810 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-2c724\""
Apr 17 08:12:43.125949 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.125832 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\""
Apr 17 08:12:43.131181 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.131153 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"]
Apr 17 08:12:43.137752 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.137731 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"]
Apr 17 08:12:43.140505 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.140486 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.153288 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.153265 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"]
Apr 17 08:12:43.290700 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.290670 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.290876 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.290718 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.290876 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.290751 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.290876 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.290818 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.290876 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.290837 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.290876 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.290854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.290876 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.290869 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgt8b\" (UniqueName: \"kubernetes.io/projected/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kube-api-access-sgt8b\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.291139 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.290896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e5717f-cc15-46bf-ad46-18e2bee0f699-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.291139 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.290970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.291139 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.291040 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.291139 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.291071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.291139 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.291126 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxgnf\" (UniqueName: \"kubernetes.io/projected/38e5717f-cc15-46bf-ad46-18e2bee0f699-kube-api-access-rxgnf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.391784 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.391755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.391784 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.391804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.392007 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.391854 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgt8b\" (UniqueName: \"kubernetes.io/projected/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kube-api-access-sgt8b\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.392007 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.391895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e5717f-cc15-46bf-ad46-18e2bee0f699-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.392007 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.391927 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.392007 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.391953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.392007 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.391969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.392007 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.391995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxgnf\" (UniqueName: \"kubernetes.io/projected/38e5717f-cc15-46bf-ad46-18e2bee0f699-kube-api-access-rxgnf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.392303 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.392303 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392236 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.392303 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392238 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.392436 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392296 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.392436 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.392436 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392372 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.392436 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.392639 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392509 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.392698 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392654 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.392752 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.392717 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.394273 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.394250 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.394429 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.394408 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.394495 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.394469 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e5717f-cc15-46bf-ad46-18e2bee0f699-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.394853 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.394832 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.400546 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.400516 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxgnf\" (UniqueName: \"kubernetes.io/projected/38e5717f-cc15-46bf-ad46-18e2bee0f699-kube-api-access-rxgnf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.400634 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.400525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgt8b\" (UniqueName: \"kubernetes.io/projected/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kube-api-access-sgt8b\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.432534 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.432513 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:43.453446 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.453418 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:43.584397 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.584348 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"]
Apr 17 08:12:43.586966 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:12:43.586931 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fcaffb1_59dc_4fc4_8326_c45b11f3b0b9.slice/crio-e142ad36e33bc4d5995308f1e1b628542e2c43d13fafedb66ac34acabb182ebf WatchSource:0}: Error finding container e142ad36e33bc4d5995308f1e1b628542e2c43d13fafedb66ac34acabb182ebf: Status 404 returned error can't find the container with id e142ad36e33bc4d5995308f1e1b628542e2c43d13fafedb66ac34acabb182ebf
Apr 17 08:12:43.603931 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.603911 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"]
Apr 17 08:12:43.605120 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:12:43.605098 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38e5717f_cc15_46bf_ad46_18e2bee0f699.slice/crio-ef2f98b5c325cf1373f9a3b8f851e5e820798275ec80321ca345cb6944e071b4 WatchSource:0}: Error finding container ef2f98b5c325cf1373f9a3b8f851e5e820798275ec80321ca345cb6944e071b4: Status 404 returned error can't find the container with id ef2f98b5c325cf1373f9a3b8f851e5e820798275ec80321ca345cb6944e071b4
Apr 17 08:12:43.947567 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.947479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" event={"ID":"38e5717f-cc15-46bf-ad46-18e2bee0f699","Type":"ContainerStarted","Data":"bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e"}
Apr 17 08:12:43.947567 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.947525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" event={"ID":"38e5717f-cc15-46bf-ad46-18e2bee0f699","Type":"ContainerStarted","Data":"ef2f98b5c325cf1373f9a3b8f851e5e820798275ec80321ca345cb6944e071b4"}
Apr 17 08:12:43.948671 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:43.948646 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" event={"ID":"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9","Type":"ContainerStarted","Data":"e142ad36e33bc4d5995308f1e1b628542e2c43d13fafedb66ac34acabb182ebf"}
Apr 17 08:12:44.954925 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:44.954887 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" event={"ID":"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9","Type":"ContainerStarted","Data":"f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493"}
Apr 17 08:12:45.962199 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:45.962158 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" event={"ID":"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9","Type":"ContainerStarted","Data":"9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552"}
Apr 17 08:12:45.962598 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:45.962256 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:47.972727 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:47.972692 2573 generic.go:358] "Generic (PLEG): container finished" podID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerID="bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e" exitCode=0
Apr 17 08:12:47.973131 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:47.972781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" event={"ID":"38e5717f-cc15-46bf-ad46-18e2bee0f699","Type":"ContainerDied","Data":"bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e"}
Apr 17 08:12:48.978246 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:48.978206 2573 generic.go:358] "Generic (PLEG): container finished" podID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerID="9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552" exitCode=0
Apr 17 08:12:48.978643 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:48.978278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" event={"ID":"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9","Type":"ContainerDied","Data":"9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552"}
Apr 17 08:12:48.980084 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:48.980055 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" event={"ID":"38e5717f-cc15-46bf-ad46-18e2bee0f699","Type":"ContainerStarted","Data":"c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d"}
Apr 17 08:12:49.020866 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:49.020782 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podStartSLOduration=6.020765992 podStartE2EDuration="6.020765992s" podCreationTimestamp="2026-04-17 08:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:12:49.018740851 +0000 UTC m=+1252.764797644" watchObservedRunningTime="2026-04-17 08:12:49.020765992 +0000 UTC m=+1252.766822812"
Apr 17 08:12:49.986325 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:49.986286 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" event={"ID":"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9","Type":"ContainerStarted","Data":"58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626"}
Apr 17 08:12:50.008554 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:50.008496 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podStartSLOduration=6.263281582 podStartE2EDuration="7.008478988s" podCreationTimestamp="2026-04-17 08:12:43 +0000 UTC" firstStartedPulling="2026-04-17 08:12:43.58897948 +0000 UTC m=+1247.335036251" lastFinishedPulling="2026-04-17 08:12:44.334176878 +0000 UTC m=+1248.080233657" observedRunningTime="2026-04-17 08:12:50.0061811 +0000 UTC m=+1253.752237895" watchObservedRunningTime="2026-04-17 08:12:50.008478988 +0000 UTC m=+1253.754535793"
Apr 17 08:12:53.432999 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:53.432952 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:53.432999 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:53.433008 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"
Apr 17 08:12:53.434465 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:53.434431 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused"
Apr 17 08:12:53.453684 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:53.453646 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:53.453822 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:53.453701 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"
Apr 17 08:12:53.455078 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:53.455043 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:12:58.495848 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.495811 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc"] Apr 17 08:12:58.524326 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.524288 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc"] Apr 17 08:12:58.524513 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.524348 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.527146 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.527122 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 17 08:12:58.634555 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.634504 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.634555 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.634556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e58e5196-3f81-4118-86b8-5ea4eff204f0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.634827 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.634584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.634827 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.634622 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44hwc\" (UniqueName: \"kubernetes.io/projected/e58e5196-3f81-4118-86b8-5ea4eff204f0-kube-api-access-44hwc\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.634827 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.634644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.634827 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.634709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: 
\"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.735212 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.735169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.735402 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.735272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.735402 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.735335 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.735402 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.735362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e58e5196-3f81-4118-86b8-5ea4eff204f0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.735564 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.735417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.735564 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.735470 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44hwc\" (UniqueName: \"kubernetes.io/projected/e58e5196-3f81-4118-86b8-5ea4eff204f0-kube-api-access-44hwc\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.735724 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.735696 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.735812 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.735730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.735877 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.735813 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.737547 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.737516 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.738042 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.738023 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e58e5196-3f81-4118-86b8-5ea4eff204f0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.743856 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.743806 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44hwc\" (UniqueName: \"kubernetes.io/projected/e58e5196-3f81-4118-86b8-5ea4eff204f0-kube-api-access-44hwc\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" 
Apr 17 08:12:58.835576 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.835494 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:12:58.977353 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:58.977294 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc"] Apr 17 08:12:58.983638 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:12:58.983594 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58e5196_3f81_4118_86b8_5ea4eff204f0.slice/crio-d4fd3682e8ec12f87ab7b73a6190cb2a7e6289a5dc427fa4ff500f45447cc83a WatchSource:0}: Error finding container d4fd3682e8ec12f87ab7b73a6190cb2a7e6289a5dc427fa4ff500f45447cc83a: Status 404 returned error can't find the container with id d4fd3682e8ec12f87ab7b73a6190cb2a7e6289a5dc427fa4ff500f45447cc83a Apr 17 08:12:59.018619 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:12:59.018576 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" event={"ID":"e58e5196-3f81-4118-86b8-5ea4eff204f0","Type":"ContainerStarted","Data":"d4fd3682e8ec12f87ab7b73a6190cb2a7e6289a5dc427fa4ff500f45447cc83a"} Apr 17 08:13:00.024066 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:00.024027 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" event={"ID":"e58e5196-3f81-4118-86b8-5ea4eff204f0","Type":"ContainerStarted","Data":"7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe"} Apr 17 08:13:03.434225 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:03.433739 2573 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:13:03.448140 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:03.448111 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" Apr 17 08:13:03.453961 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:03.453879 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:13:04.041845 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:04.041806 2573 generic.go:358] "Generic (PLEG): container finished" podID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerID="7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe" exitCode=0 Apr 17 08:13:04.041845 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:04.041821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" event={"ID":"e58e5196-3f81-4118-86b8-5ea4eff204f0","Type":"ContainerDied","Data":"7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe"} Apr 17 08:13:05.047768 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:05.047736 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" event={"ID":"e58e5196-3f81-4118-86b8-5ea4eff204f0","Type":"ContainerStarted","Data":"a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed"} Apr 17 08:13:05.074107 
ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:05.071620 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podStartSLOduration=7.071599644 podStartE2EDuration="7.071599644s" podCreationTimestamp="2026-04-17 08:12:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:13:05.068057798 +0000 UTC m=+1268.814114592" watchObservedRunningTime="2026-04-17 08:13:05.071599644 +0000 UTC m=+1268.817656438" Apr 17 08:13:08.836250 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:08.836205 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:13:08.836250 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:08.836257 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:13:08.838304 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:08.838268 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:13:13.433137 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:13.433084 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:13:13.454665 ip-10-0-137-8 kubenswrapper[2573]: I0417 
08:13:13.454624 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:13:18.836373 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:18.836325 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:13:23.433150 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:23.433094 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:13:23.454162 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:23.454114 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:13:28.836661 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:28.836601 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: 
connect: connection refused" Apr 17 08:13:33.433753 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:33.433705 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:13:33.454562 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:33.454515 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:13:38.836663 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:38.836608 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:13:43.433509 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:43.433453 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:13:43.454352 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:43.454305 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" 
probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:13:48.836468 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:48.836418 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:13:53.433696 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:53.433635 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:13:53.454846 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:53.454800 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:13:58.836198 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:13:58.836157 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:14:03.433946 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:03.433897 2573 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:14:03.454426 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:03.454383 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:14:08.836877 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:08.836834 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:14:13.433760 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:13.433712 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:14:13.454142 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:13.454095 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:14:18.836925 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:14:18.836875 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:14:23.433701 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:23.433649 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:14:23.454758 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:23.454717 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:14:28.836080 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:28.836037 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:14:33.433635 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:33.433583 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": 
dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:14:33.454276 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:33.454231 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:14:38.836609 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:38.836569 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:14:43.433646 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:43.433598 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:14:43.454069 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:43.454028 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:14:48.836208 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:48.836161 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:14:53.433289 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:53.433239 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:14:53.454038 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:53.453998 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:14:58.836257 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:14:58.836200 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:15:03.433407 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:03.433341 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:15:03.453977 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:03.453925 2573 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:15:08.835938 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:08.835890 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:15:13.433689 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:13.433637 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:15:13.454278 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:13.454233 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:15:18.841836 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:18.841764 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:15:23.433732 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:15:23.433675 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:15:23.454148 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:23.454104 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:15:28.836903 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:28.836857 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:15:33.433282 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:33.433235 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.36:8001/health\": dial tcp 10.133.0.36:8001: connect: connection refused" Apr 17 08:15:33.454063 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:33.454023 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" probeResult="failure" output="Get \"https://10.133.0.37:8000/health\": 
dial tcp 10.133.0.37:8000: connect: connection refused" Apr 17 08:15:38.836998 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:38.836933 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.38:8000/health\": dial tcp 10.133.0.38:8000: connect: connection refused" Apr 17 08:15:43.442960 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:43.442875 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" Apr 17 08:15:43.455907 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:43.455875 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" Apr 17 08:15:43.463820 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:43.463776 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" Apr 17 08:15:43.472454 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:43.472406 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" Apr 17 08:15:48.854068 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:48.854029 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:15:49.674325 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:49.674289 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:15:57.096116 ip-10-0-137-8 kubenswrapper[2573]: I0417 
08:15:57.096081 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc"] Apr 17 08:15:57.096755 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:15:57.096366 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" containerID="cri-o://a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed" gracePeriod=30 Apr 17 08:16:07.266300 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.266262 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 08:16:07.270087 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.270062 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.272803 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.272761 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 17 08:16:07.273851 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.273823 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-nxt4g\"" Apr 17 08:16:07.284261 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.284237 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 08:16:07.348597 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.348566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.348597 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.348603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.348818 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.348624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.348818 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.348666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2cee61-5c50-4be7-b2a6-a92410798b4c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.348818 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.348691 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.348818 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.348774 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp54d\" (UniqueName: \"kubernetes.io/projected/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kube-api-access-qp54d\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.449345 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.449294 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp54d\" (UniqueName: \"kubernetes.io/projected/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kube-api-access-qp54d\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.449557 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.449400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.449557 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.449425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: 
\"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.449557 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.449444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.449557 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.449463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2cee61-5c50-4be7-b2a6-a92410798b4c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.449557 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.449488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.449874 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.449849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.449934 ip-10-0-137-8 kubenswrapper[2573]: I0417 
08:16:07.449896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.449989 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.449900 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.451817 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.451765 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.451949 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.451930 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2cee61-5c50-4be7-b2a6-a92410798b4c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.457963 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.457928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp54d\" (UniqueName: 
\"kubernetes.io/projected/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kube-api-access-qp54d\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.582193 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.582112 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:07.708333 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.708304 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 17 08:16:07.710295 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:16:07.710268 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce2cee61_5c50_4be7_b2a6_a92410798b4c.slice/crio-dcd10bd32d8a2296d94562d670c7cd4bc9b566cd44dcb2e7f5ce6fd47fdc402f WatchSource:0}: Error finding container dcd10bd32d8a2296d94562d670c7cd4bc9b566cd44dcb2e7f5ce6fd47fdc402f: Status 404 returned error can't find the container with id dcd10bd32d8a2296d94562d670c7cd4bc9b566cd44dcb2e7f5ce6fd47fdc402f Apr 17 08:16:07.712284 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.712263 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:16:07.733765 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:07.733743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"ce2cee61-5c50-4be7-b2a6-a92410798b4c","Type":"ContainerStarted","Data":"dcd10bd32d8a2296d94562d670c7cd4bc9b566cd44dcb2e7f5ce6fd47fdc402f"} Apr 17 08:16:08.738869 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:08.738828 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"ce2cee61-5c50-4be7-b2a6-a92410798b4c","Type":"ContainerStarted","Data":"3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc"} Apr 17 08:16:09.408038 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:09.408006 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"] Apr 17 08:16:09.408310 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:09.408279 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" containerID="cri-o://c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d" gracePeriod=30 Apr 17 08:16:09.414962 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:09.414932 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"] Apr 17 08:16:09.415378 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:09.415344 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" containerID="cri-o://58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626" gracePeriod=30 Apr 17 08:16:12.758170 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:12.758126 2573 generic.go:358] "Generic (PLEG): container finished" podID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerID="3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc" exitCode=0 Apr 17 08:16:12.758558 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:12.758194 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"ce2cee61-5c50-4be7-b2a6-a92410798b4c","Type":"ContainerDied","Data":"3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc"} Apr 17 08:16:13.763480 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:13.763445 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"ce2cee61-5c50-4be7-b2a6-a92410798b4c","Type":"ContainerStarted","Data":"aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87"} Apr 17 08:16:13.783554 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:13.783501 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.7834868539999995 podStartE2EDuration="6.783486854s" podCreationTimestamp="2026-04-17 08:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:16:13.782279016 +0000 UTC m=+1457.528335812" watchObservedRunningTime="2026-04-17 08:16:13.783486854 +0000 UTC m=+1457.529543639" Apr 17 08:16:17.582258 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:17.582221 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:17.583856 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:17.583817 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 17 08:16:27.367725 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.367701 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc_e58e5196-3f81-4118-86b8-5ea4eff204f0/main/0.log" Apr 17 08:16:27.368106 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.368089 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:16:27.431349 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.431319 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-model-cache\") pod \"e58e5196-3f81-4118-86b8-5ea4eff204f0\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " Apr 17 08:16:27.431506 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.431376 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-kserve-provision-location\") pod \"e58e5196-3f81-4118-86b8-5ea4eff204f0\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " Apr 17 08:16:27.431506 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.431457 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e58e5196-3f81-4118-86b8-5ea4eff204f0-tls-certs\") pod \"e58e5196-3f81-4118-86b8-5ea4eff204f0\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " Apr 17 08:16:27.431506 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.431487 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44hwc\" (UniqueName: \"kubernetes.io/projected/e58e5196-3f81-4118-86b8-5ea4eff204f0-kube-api-access-44hwc\") pod \"e58e5196-3f81-4118-86b8-5ea4eff204f0\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " Apr 17 08:16:27.431660 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.431521 2573 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-home\") pod \"e58e5196-3f81-4118-86b8-5ea4eff204f0\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " Apr 17 08:16:27.431660 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.431584 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-dshm\") pod \"e58e5196-3f81-4118-86b8-5ea4eff204f0\" (UID: \"e58e5196-3f81-4118-86b8-5ea4eff204f0\") " Apr 17 08:16:27.431660 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.431605 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-model-cache" (OuterVolumeSpecName: "model-cache") pod "e58e5196-3f81-4118-86b8-5ea4eff204f0" (UID: "e58e5196-3f81-4118-86b8-5ea4eff204f0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:27.431881 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.431861 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:27.432024 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.431995 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-home" (OuterVolumeSpecName: "home") pod "e58e5196-3f81-4118-86b8-5ea4eff204f0" (UID: "e58e5196-3f81-4118-86b8-5ea4eff204f0"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:27.439252 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.439214 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e58e5196-3f81-4118-86b8-5ea4eff204f0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e58e5196-3f81-4118-86b8-5ea4eff204f0" (UID: "e58e5196-3f81-4118-86b8-5ea4eff204f0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:16:27.439558 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.439529 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-dshm" (OuterVolumeSpecName: "dshm") pod "e58e5196-3f81-4118-86b8-5ea4eff204f0" (UID: "e58e5196-3f81-4118-86b8-5ea4eff204f0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:27.439653 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.439594 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58e5196-3f81-4118-86b8-5ea4eff204f0-kube-api-access-44hwc" (OuterVolumeSpecName: "kube-api-access-44hwc") pod "e58e5196-3f81-4118-86b8-5ea4eff204f0" (UID: "e58e5196-3f81-4118-86b8-5ea4eff204f0"). InnerVolumeSpecName "kube-api-access-44hwc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:16:27.469375 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.469340 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e58e5196-3f81-4118-86b8-5ea4eff204f0" (UID: "e58e5196-3f81-4118-86b8-5ea4eff204f0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:27.532857 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.532829 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:27.532857 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.532855 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:27.533011 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.532866 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e58e5196-3f81-4118-86b8-5ea4eff204f0-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:27.533011 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.532876 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-44hwc\" (UniqueName: \"kubernetes.io/projected/e58e5196-3f81-4118-86b8-5ea4eff204f0-kube-api-access-44hwc\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:27.533011 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.532885 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e58e5196-3f81-4118-86b8-5ea4eff204f0-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:27.583117 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.583076 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 17 
08:16:27.815890 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.815866 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc_e58e5196-3f81-4118-86b8-5ea4eff204f0/main/0.log" Apr 17 08:16:27.816223 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.816197 2573 generic.go:358] "Generic (PLEG): container finished" podID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerID="a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed" exitCode=137 Apr 17 08:16:27.816310 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.816262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" event={"ID":"e58e5196-3f81-4118-86b8-5ea4eff204f0","Type":"ContainerDied","Data":"a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed"} Apr 17 08:16:27.816310 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.816287 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" event={"ID":"e58e5196-3f81-4118-86b8-5ea4eff204f0","Type":"ContainerDied","Data":"d4fd3682e8ec12f87ab7b73a6190cb2a7e6289a5dc427fa4ff500f45447cc83a"} Apr 17 08:16:27.816310 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.816288 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc" Apr 17 08:16:27.816310 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.816303 2573 scope.go:117] "RemoveContainer" containerID="a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed" Apr 17 08:16:27.840543 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.840514 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc"] Apr 17 08:16:27.842868 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.842680 2573 scope.go:117] "RemoveContainer" containerID="7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe" Apr 17 08:16:27.847565 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.847528 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-556f686b94q8jrc"] Apr 17 08:16:27.884665 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.884646 2573 scope.go:117] "RemoveContainer" containerID="a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed" Apr 17 08:16:27.885020 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:16:27.884998 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed\": container with ID starting with a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed not found: ID does not exist" containerID="a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed" Apr 17 08:16:27.885106 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.885027 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed"} err="failed to get container status \"a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed\": rpc error: 
code = NotFound desc = could not find container \"a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed\": container with ID starting with a9f4c97896568418142d5922218884b21379a2bf074900c7d9004faa9498c6ed not found: ID does not exist" Apr 17 08:16:27.885106 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.885047 2573 scope.go:117] "RemoveContainer" containerID="7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe" Apr 17 08:16:27.885288 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:16:27.885266 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe\": container with ID starting with 7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe not found: ID does not exist" containerID="7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe" Apr 17 08:16:27.885328 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:27.885295 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe"} err="failed to get container status \"7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe\": rpc error: code = NotFound desc = could not find container \"7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe\": container with ID starting with 7a49f8fbc0a480dcbddf6461ad0cfd49b5c869e0b86f1c95e228a1970954b9fe not found: ID does not exist" Apr 17 08:16:28.803081 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:28.803043 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" path="/var/lib/kubelet/pods/e58e5196-3f81-4118-86b8-5ea4eff204f0/volumes" Apr 17 08:16:35.401162 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.401123 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm"] Apr 17 08:16:35.401552 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.401529 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="storage-initializer" Apr 17 08:16:35.401552 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.401543 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="storage-initializer" Apr 17 08:16:35.401682 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.401556 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" Apr 17 08:16:35.401682 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.401562 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" Apr 17 08:16:35.401682 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.401635 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e58e5196-3f81-4118-86b8-5ea4eff204f0" containerName="main" Apr 17 08:16:35.406983 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.406963 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.409273 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.409250 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-mng64\"" Apr 17 08:16:35.409400 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.409292 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 17 08:16:35.417518 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.417487 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm"] Apr 17 08:16:35.426655 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.426630 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc"] Apr 17 08:16:35.430115 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.430094 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.441859 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.441830 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc"] Apr 17 08:16:35.505033 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505000 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-home\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.505199 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djf4x\" (UniqueName: \"kubernetes.io/projected/deabb10c-4e65-4256-b8ac-da6a0c980429-kube-api-access-djf4x\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.505199 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.505199 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.505199 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505163 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-dshm\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.505337 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.505337 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505321 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7bdx\" (UniqueName: \"kubernetes.io/projected/2dd20ad5-d04a-424d-8da8-24c269e59eac-kube-api-access-v7bdx\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.505403 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505351 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/deabb10c-4e65-4256-b8ac-da6a0c980429-tls-certs\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.505403 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.505470 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.505506 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505487 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dd20ad5-d04a-424d-8da8-24c269e59eac-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.505544 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.505512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.606505 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dd20ad5-d04a-424d-8da8-24c269e59eac-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.606505 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.606771 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-home\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.606771 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djf4x\" (UniqueName: \"kubernetes.io/projected/deabb10c-4e65-4256-b8ac-da6a0c980429-kube-api-access-djf4x\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.606771 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.606771 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.606771 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606681 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-dshm\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.606771 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.607086 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606780 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7bdx\" (UniqueName: \"kubernetes.io/projected/2dd20ad5-d04a-424d-8da8-24c269e59eac-kube-api-access-v7bdx\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.607086 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/deabb10c-4e65-4256-b8ac-da6a0c980429-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.607086 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.607086 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.606894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.607086 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.607044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.607086 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.607054 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-home\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.607393 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.607235 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.607393 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.607312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.607505 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.607397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.607505 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.607492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.609376 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.609340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dd20ad5-d04a-424d-8da8-24c269e59eac-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.609553 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.609531 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.609630 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.609610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-dshm\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.609670 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:16:35.609643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/deabb10c-4e65-4256-b8ac-da6a0c980429-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.617188 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.617164 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7bdx\" (UniqueName: \"kubernetes.io/projected/2dd20ad5-d04a-424d-8da8-24c269e59eac-kube-api-access-v7bdx\") pod \"custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.626271 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.626247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djf4x\" (UniqueName: \"kubernetes.io/projected/deabb10c-4e65-4256-b8ac-da6a0c980429-kube-api-access-djf4x\") pod \"custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.718403 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.718304 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:35.744253 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.744222 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:35.874557 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.874518 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm"] Apr 17 08:16:35.878338 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:16:35.878299 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd20ad5_d04a_424d_8da8_24c269e59eac.slice/crio-6dee9cc20f74d3cd22b226f0e745a4f8a04ca174a70f533ff61bbcc4d9f3f1f2 WatchSource:0}: Error finding container 6dee9cc20f74d3cd22b226f0e745a4f8a04ca174a70f533ff61bbcc4d9f3f1f2: Status 404 returned error can't find the container with id 6dee9cc20f74d3cd22b226f0e745a4f8a04ca174a70f533ff61bbcc4d9f3f1f2 Apr 17 08:16:35.896968 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:35.896942 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc"] Apr 17 08:16:35.899452 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:16:35.899416 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeabb10c_4e65_4256_b8ac_da6a0c980429.slice/crio-514a0d848960c5a4bb43254a7914d4f42ec4ef7ebfdb17446ae2399852f52e65 WatchSource:0}: Error finding container 514a0d848960c5a4bb43254a7914d4f42ec4ef7ebfdb17446ae2399852f52e65: Status 404 returned error can't find the container with id 514a0d848960c5a4bb43254a7914d4f42ec4ef7ebfdb17446ae2399852f52e65 Apr 17 08:16:36.850865 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:36.850828 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" 
event={"ID":"2dd20ad5-d04a-424d-8da8-24c269e59eac","Type":"ContainerStarted","Data":"bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378"} Apr 17 08:16:36.850865 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:36.850875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" event={"ID":"2dd20ad5-d04a-424d-8da8-24c269e59eac","Type":"ContainerStarted","Data":"6dee9cc20f74d3cd22b226f0e745a4f8a04ca174a70f533ff61bbcc4d9f3f1f2"} Apr 17 08:16:36.851503 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:36.850907 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:36.852970 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:36.852941 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" event={"ID":"deabb10c-4e65-4256-b8ac-da6a0c980429","Type":"ContainerStarted","Data":"bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264"} Apr 17 08:16:36.853072 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:36.852977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" event={"ID":"deabb10c-4e65-4256-b8ac-da6a0c980429","Type":"ContainerStarted","Data":"514a0d848960c5a4bb43254a7914d4f42ec4ef7ebfdb17446ae2399852f52e65"} Apr 17 08:16:37.583241 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:37.583182 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 17 08:16:37.583487 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:37.583456 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 17 08:16:37.858231 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:37.858127 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" event={"ID":"2dd20ad5-d04a-424d-8da8-24c269e59eac","Type":"ContainerStarted","Data":"e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b"} Apr 17 08:16:39.416488 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.416401 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="llm-d-routing-sidecar" containerID="cri-o://f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493" gracePeriod=2 Apr 17 08:16:39.729979 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.729524 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" Apr 17 08:16:39.855120 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.855092 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-dshm\") pod \"38e5717f-cc15-46bf-ad46-18e2bee0f699\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " Apr 17 08:16:39.855284 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.855143 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-home\") pod \"38e5717f-cc15-46bf-ad46-18e2bee0f699\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " Apr 17 08:16:39.855284 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.855101 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl_1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9/main/0.log" Apr 17 08:16:39.855284 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.855221 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-model-cache\") pod \"38e5717f-cc15-46bf-ad46-18e2bee0f699\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " Apr 17 08:16:39.855284 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.855260 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-kserve-provision-location\") pod \"38e5717f-cc15-46bf-ad46-18e2bee0f699\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " Apr 17 08:16:39.855528 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.855299 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e5717f-cc15-46bf-ad46-18e2bee0f699-tls-certs\") pod \"38e5717f-cc15-46bf-ad46-18e2bee0f699\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " Apr 17 08:16:39.855528 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.855351 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxgnf\" (UniqueName: \"kubernetes.io/projected/38e5717f-cc15-46bf-ad46-18e2bee0f699-kube-api-access-rxgnf\") pod \"38e5717f-cc15-46bf-ad46-18e2bee0f699\" (UID: \"38e5717f-cc15-46bf-ad46-18e2bee0f699\") " Apr 17 08:16:39.855834 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.855762 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-home" (OuterVolumeSpecName: "home") pod "38e5717f-cc15-46bf-ad46-18e2bee0f699" (UID: "38e5717f-cc15-46bf-ad46-18e2bee0f699"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:39.858564 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.858537 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-dshm" (OuterVolumeSpecName: "dshm") pod "38e5717f-cc15-46bf-ad46-18e2bee0f699" (UID: "38e5717f-cc15-46bf-ad46-18e2bee0f699"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:39.858564 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.858548 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e5717f-cc15-46bf-ad46-18e2bee0f699-kube-api-access-rxgnf" (OuterVolumeSpecName: "kube-api-access-rxgnf") pod "38e5717f-cc15-46bf-ad46-18e2bee0f699" (UID: "38e5717f-cc15-46bf-ad46-18e2bee0f699"). InnerVolumeSpecName "kube-api-access-rxgnf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:16:39.858910 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.858555 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e5717f-cc15-46bf-ad46-18e2bee0f699-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "38e5717f-cc15-46bf-ad46-18e2bee0f699" (UID: "38e5717f-cc15-46bf-ad46-18e2bee0f699"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:16:39.858910 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.858644 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-model-cache" (OuterVolumeSpecName: "model-cache") pod "38e5717f-cc15-46bf-ad46-18e2bee0f699" (UID: "38e5717f-cc15-46bf-ad46-18e2bee0f699"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:39.859027 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.859008 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" Apr 17 08:16:39.868584 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.868554 2573 generic.go:358] "Generic (PLEG): container finished" podID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerID="c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d" exitCode=137 Apr 17 08:16:39.868725 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.868633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" event={"ID":"38e5717f-cc15-46bf-ad46-18e2bee0f699","Type":"ContainerDied","Data":"c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d"} Apr 17 08:16:39.868725 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.868639 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" Apr 17 08:16:39.868725 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.868671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j" event={"ID":"38e5717f-cc15-46bf-ad46-18e2bee0f699","Type":"ContainerDied","Data":"ef2f98b5c325cf1373f9a3b8f851e5e820798275ec80321ca345cb6944e071b4"} Apr 17 08:16:39.868725 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.868691 2573 scope.go:117] "RemoveContainer" containerID="c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d" Apr 17 08:16:39.870381 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.870363 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl_1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9/main/0.log" Apr 17 08:16:39.871297 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.871179 2573 generic.go:358] "Generic (PLEG): container finished" podID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerID="58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626" exitCode=137 Apr 17 08:16:39.871297 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.871199 2573 generic.go:358] "Generic (PLEG): container finished" podID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerID="f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493" exitCode=0 Apr 17 08:16:39.871297 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.871231 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" event={"ID":"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9","Type":"ContainerDied","Data":"58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626"} Apr 17 08:16:39.871297 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.871256 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" event={"ID":"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9","Type":"ContainerDied","Data":"f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493"} Apr 17 08:16:39.871297 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.871268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" event={"ID":"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9","Type":"ContainerDied","Data":"e142ad36e33bc4d5995308f1e1b628542e2c43d13fafedb66ac34acabb182ebf"} Apr 17 08:16:39.871527 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.871305 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl" Apr 17 08:16:39.898106 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.897945 2573 scope.go:117] "RemoveContainer" containerID="bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e" Apr 17 08:16:39.955984 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.955889 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "38e5717f-cc15-46bf-ad46-18e2bee0f699" (UID: "38e5717f-cc15-46bf-ad46-18e2bee0f699"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:39.956244 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.956220 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-tls-certs\") pod \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " Apr 17 08:16:39.956350 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.956284 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-home\") pod \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " Apr 17 08:16:39.956434 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.956410 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgt8b\" (UniqueName: \"kubernetes.io/projected/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kube-api-access-sgt8b\") pod \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " Apr 17 08:16:39.956544 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.956495 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kserve-provision-location\") pod \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " Apr 17 08:16:39.956610 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.956543 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-model-cache\") pod \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " Apr 17 08:16:39.956671 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.956612 2573 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-dshm\") pod \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\" (UID: \"1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9\") " Apr 17 08:16:39.957243 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.956758 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-home" (OuterVolumeSpecName: "home") pod "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" (UID: "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:39.957243 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.957099 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:39.957243 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.957119 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:39.957243 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.957133 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:39.957243 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.957152 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e5717f-cc15-46bf-ad46-18e2bee0f699-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:39.957243 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.957168 2573 
reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e5717f-cc15-46bf-ad46-18e2bee0f699-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:39.957243 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.957184 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rxgnf\" (UniqueName: \"kubernetes.io/projected/38e5717f-cc15-46bf-ad46-18e2bee0f699-kube-api-access-rxgnf\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:39.957243 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.957197 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:39.957243 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.957096 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-model-cache" (OuterVolumeSpecName: "model-cache") pod "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" (UID: "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:39.959876 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.959844 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kube-api-access-sgt8b" (OuterVolumeSpecName: "kube-api-access-sgt8b") pod "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" (UID: "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9"). InnerVolumeSpecName "kube-api-access-sgt8b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:16:39.959992 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.959710 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" (UID: "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:16:39.960188 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.960165 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-dshm" (OuterVolumeSpecName: "dshm") pod "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" (UID: "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:39.967577 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.967538 2573 scope.go:117] "RemoveContainer" containerID="c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d" Apr 17 08:16:39.968045 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:16:39.968019 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d\": container with ID starting with c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d not found: ID does not exist" containerID="c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d" Apr 17 08:16:39.968121 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.968053 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d"} err="failed to get container status \"c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d\": rpc error: code = NotFound desc = could not 
find container \"c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d\": container with ID starting with c6a8eb0ab162cb8c5c585cfec1a8c4297980a0fd077c92c47dd549f6a41c247d not found: ID does not exist" Apr 17 08:16:39.968121 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.968074 2573 scope.go:117] "RemoveContainer" containerID="bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e" Apr 17 08:16:39.968392 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:16:39.968370 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e\": container with ID starting with bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e not found: ID does not exist" containerID="bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e" Apr 17 08:16:39.968460 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.968397 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e"} err="failed to get container status \"bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e\": rpc error: code = NotFound desc = could not find container \"bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e\": container with ID starting with bb55a399472264cabc98becaa84f4aca5d1e40192b954d268f4bf675f258979e not found: ID does not exist" Apr 17 08:16:39.968460 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.968417 2573 scope.go:117] "RemoveContainer" containerID="58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626" Apr 17 08:16:39.989941 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:39.989920 2573 scope.go:117] "RemoveContainer" containerID="9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552" Apr 17 08:16:40.032239 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.032171 2573 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" (UID: "1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:40.051863 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.051832 2573 scope.go:117] "RemoveContainer" containerID="f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493" Apr 17 08:16:40.058566 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.058520 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:40.058566 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.058545 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:40.058566 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.058560 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:40.058566 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.058571 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:40.058989 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.058580 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sgt8b\" (UniqueName: 
\"kubernetes.io/projected/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9-kube-api-access-sgt8b\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:16:40.059968 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.059939 2573 scope.go:117] "RemoveContainer" containerID="58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626" Apr 17 08:16:40.060284 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:16:40.060263 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626\": container with ID starting with 58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626 not found: ID does not exist" containerID="58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626" Apr 17 08:16:40.060371 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.060298 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626"} err="failed to get container status \"58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626\": rpc error: code = NotFound desc = could not find container \"58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626\": container with ID starting with 58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626 not found: ID does not exist" Apr 17 08:16:40.060371 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.060320 2573 scope.go:117] "RemoveContainer" containerID="9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552" Apr 17 08:16:40.060644 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:16:40.060612 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552\": container with ID starting with 
9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552 not found: ID does not exist" containerID="9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552" Apr 17 08:16:40.060727 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.060652 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552"} err="failed to get container status \"9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552\": rpc error: code = NotFound desc = could not find container \"9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552\": container with ID starting with 9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552 not found: ID does not exist" Apr 17 08:16:40.060727 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.060676 2573 scope.go:117] "RemoveContainer" containerID="f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493" Apr 17 08:16:40.060959 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:16:40.060935 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493\": container with ID starting with f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493 not found: ID does not exist" containerID="f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493" Apr 17 08:16:40.061042 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.060964 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493"} err="failed to get container status \"f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493\": rpc error: code = NotFound desc = could not find container \"f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493\": container with ID starting with 
f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493 not found: ID does not exist" Apr 17 08:16:40.061042 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.060979 2573 scope.go:117] "RemoveContainer" containerID="58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626" Apr 17 08:16:40.061228 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.061204 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626"} err="failed to get container status \"58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626\": rpc error: code = NotFound desc = could not find container \"58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626\": container with ID starting with 58110c836f8639e5daaa734570ca80695ea9132125d36e9d4518d6ea66653626 not found: ID does not exist" Apr 17 08:16:40.061228 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.061227 2573 scope.go:117] "RemoveContainer" containerID="9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552" Apr 17 08:16:40.061462 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.061438 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552"} err="failed to get container status \"9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552\": rpc error: code = NotFound desc = could not find container \"9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552\": container with ID starting with 9c5abfcab54439779c38993687582807fca714330f737e6017710df10a430552 not found: ID does not exist" Apr 17 08:16:40.061462 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.061460 2573 scope.go:117] "RemoveContainer" containerID="f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493" Apr 17 08:16:40.061706 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.061687 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493"} err="failed to get container status \"f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493\": rpc error: code = NotFound desc = could not find container \"f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493\": container with ID starting with f1e21347fdc85b2c662dafac367d7dcef6d4a222cb273a713f7fa6aa724f8493 not found: ID does not exist" Apr 17 08:16:40.197034 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.197001 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"] Apr 17 08:16:40.202124 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.202089 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-9b7b4dbbf-dqqjl"] Apr 17 08:16:40.212181 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.212105 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"] Apr 17 08:16:40.216706 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.216681 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vtv4j"] Apr 17 08:16:40.804782 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.804737 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" path="/var/lib/kubelet/pods/1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9/volumes" Apr 17 08:16:40.805722 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.805696 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" path="/var/lib/kubelet/pods/38e5717f-cc15-46bf-ad46-18e2bee0f699/volumes" Apr 17 08:16:40.877707 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:16:40.877670 2573 generic.go:358] "Generic (PLEG): container finished" podID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerID="bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264" exitCode=0 Apr 17 08:16:40.877926 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:40.877726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" event={"ID":"deabb10c-4e65-4256-b8ac-da6a0c980429","Type":"ContainerDied","Data":"bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264"} Apr 17 08:16:41.883835 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:41.883780 2573 generic.go:358] "Generic (PLEG): container finished" podID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerID="e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b" exitCode=0 Apr 17 08:16:41.883835 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:41.883823 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" event={"ID":"2dd20ad5-d04a-424d-8da8-24c269e59eac","Type":"ContainerDied","Data":"e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b"} Apr 17 08:16:41.885885 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:41.885860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" event={"ID":"deabb10c-4e65-4256-b8ac-da6a0c980429","Type":"ContainerStarted","Data":"92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01"} Apr 17 08:16:41.922588 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:41.922529 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podStartSLOduration=6.922508697 podStartE2EDuration="6.922508697s" podCreationTimestamp="2026-04-17 08:16:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:16:41.920733302 +0000 UTC m=+1485.666790112" watchObservedRunningTime="2026-04-17 08:16:41.922508697 +0000 UTC m=+1485.668565492" Apr 17 08:16:42.892674 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:42.892638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" event={"ID":"2dd20ad5-d04a-424d-8da8-24c269e59eac","Type":"ContainerStarted","Data":"2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94"} Apr 17 08:16:42.916028 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:42.915978 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podStartSLOduration=7.915960567 podStartE2EDuration="7.915960567s" podCreationTimestamp="2026-04-17 08:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:16:42.91390079 +0000 UTC m=+1486.659957585" watchObservedRunningTime="2026-04-17 08:16:42.915960567 +0000 UTC m=+1486.662017362" Apr 17 08:16:45.719097 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:45.719058 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:45.719097 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:45.719107 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:45.720621 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:45.720486 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 17 08:16:45.740693 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:45.740663 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:16:45.745025 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:45.745004 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:45.745151 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:45.745038 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:16:45.746878 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:45.746843 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 17 08:16:47.582776 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:47.582720 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 17 08:16:55.719156 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:55.719105 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 17 08:16:55.745624 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:55.745584 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 17 08:16:56.810707 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:56.810644 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 08:16:56.812503 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:56.812476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 08:16:57.583128 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:16:57.583079 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused" Apr 17 08:17:05.719338 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:05.719282 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 17 08:17:05.745189 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:05.745109 2573 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:17:07.583233 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:07.583183 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 17 08:17:15.719428 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:15.719364 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:17:15.745132 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:15.745088 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:17:17.583661 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:17.583612 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 17 08:17:25.719733 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:25.719679 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:17:25.745214 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:25.745166 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:17:27.583524 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:27.583477 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 17 08:17:35.719784 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:35.719733 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:17:35.745169 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:35.745128 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:17:37.583178 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:37.583134 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 17 08:17:45.719702 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:45.719646 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:17:45.744680 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:45.744645 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:17:47.582799 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:47.582751 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 17 08:17:55.719532 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:55.719481 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:17:55.744804 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:55.744747 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:17:57.583247 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:17:57.583191 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 17 08:18:05.719113 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:05.719071 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:18:05.745623 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:05.745573 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:18:07.582923 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:07.582879 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 17 08:18:15.719164 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:15.719114 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:18:15.744762 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:15.744712 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:18:17.583641 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:17.583584 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.39:8000/health\": dial tcp 10.133.0.39:8000: connect: connection refused"
Apr 17 08:18:25.718754 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:25.718693 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:18:25.744916 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:25.744867 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:18:27.592455 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:27.592423 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 08:18:27.600087 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:27.600060 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 08:18:35.719806 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:35.719733 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:18:35.745311 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:35.745267 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:18:42.957161 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:42.957098 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 17 08:18:42.957627 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:42.957465 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" containerID="cri-o://aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87" gracePeriod=30
Apr 17 08:18:44.219862 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.219837 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 08:18:44.335102 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.335006 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp54d\" (UniqueName: \"kubernetes.io/projected/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kube-api-access-qp54d\") pod \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") "
Apr 17 08:18:44.335102 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.335066 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-home\") pod \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") "
Apr 17 08:18:44.335338 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.335132 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-model-cache\") pod \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") "
Apr 17 08:18:44.335338 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.335156 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kserve-provision-location\") pod \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") "
Apr 17 08:18:44.335338 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.335183 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-dshm\") pod \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") "
Apr 17 08:18:44.335338 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.335236 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2cee61-5c50-4be7-b2a6-a92410798b4c-tls-certs\") pod \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\" (UID: \"ce2cee61-5c50-4be7-b2a6-a92410798b4c\") "
Apr 17 08:18:44.335536 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.335493 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-home" (OuterVolumeSpecName: "home") pod "ce2cee61-5c50-4be7-b2a6-a92410798b4c" (UID: "ce2cee61-5c50-4be7-b2a6-a92410798b4c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:18:44.335587 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.335575 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-model-cache" (OuterVolumeSpecName: "model-cache") pod "ce2cee61-5c50-4be7-b2a6-a92410798b4c" (UID: "ce2cee61-5c50-4be7-b2a6-a92410798b4c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:18:44.337435 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.337397 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2cee61-5c50-4be7-b2a6-a92410798b4c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ce2cee61-5c50-4be7-b2a6-a92410798b4c" (UID: "ce2cee61-5c50-4be7-b2a6-a92410798b4c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:18:44.337563 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.337454 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-dshm" (OuterVolumeSpecName: "dshm") pod "ce2cee61-5c50-4be7-b2a6-a92410798b4c" (UID: "ce2cee61-5c50-4be7-b2a6-a92410798b4c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:18:44.337563 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.337542 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kube-api-access-qp54d" (OuterVolumeSpecName: "kube-api-access-qp54d") pod "ce2cee61-5c50-4be7-b2a6-a92410798b4c" (UID: "ce2cee61-5c50-4be7-b2a6-a92410798b4c"). InnerVolumeSpecName "kube-api-access-qp54d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:18:44.352241 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.352210 2573 generic.go:358] "Generic (PLEG): container finished" podID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerID="aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87" exitCode=0
Apr 17 08:18:44.352365 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.352285 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 17 08:18:44.352365 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.352296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"ce2cee61-5c50-4be7-b2a6-a92410798b4c","Type":"ContainerDied","Data":"aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87"}
Apr 17 08:18:44.352365 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.352338 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"ce2cee61-5c50-4be7-b2a6-a92410798b4c","Type":"ContainerDied","Data":"dcd10bd32d8a2296d94562d670c7cd4bc9b566cd44dcb2e7f5ce6fd47fdc402f"}
Apr 17 08:18:44.352365 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.352356 2573 scope.go:117] "RemoveContainer" containerID="aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87"
Apr 17 08:18:44.380555 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.380531 2573 scope.go:117] "RemoveContainer" containerID="3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc"
Apr 17 08:18:44.406651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.406611 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ce2cee61-5c50-4be7-b2a6-a92410798b4c" (UID: "ce2cee61-5c50-4be7-b2a6-a92410798b4c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:18:44.436731 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.436703 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:18:44.436731 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.436730 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:18:44.436915 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.436743 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:18:44.436915 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.436752 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ce2cee61-5c50-4be7-b2a6-a92410798b4c-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:18:44.436915 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.436761 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2cee61-5c50-4be7-b2a6-a92410798b4c-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:18:44.436915 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.436769 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qp54d\" (UniqueName: \"kubernetes.io/projected/ce2cee61-5c50-4be7-b2a6-a92410798b4c-kube-api-access-qp54d\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:18:44.456237 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.456216 2573 scope.go:117] "RemoveContainer" containerID="aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87"
Apr 17 08:18:44.456589 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:18:44.456572 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87\": container with ID starting with aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87 not found: ID does not exist" containerID="aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87"
Apr 17 08:18:44.456651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.456599 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87"} err="failed to get container status \"aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87\": rpc error: code = NotFound desc = could not find container \"aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87\": container with ID starting with aa4be9693f4414ddd2b4ce5540084f7a3597b86aaf252259bc89caf90a9c6c87 not found: ID does not exist"
Apr 17 08:18:44.456651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.456616 2573 scope.go:117] "RemoveContainer" containerID="3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc"
Apr 17 08:18:44.456847 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:18:44.456823 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc\": container with ID starting with 3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc not found: ID does not exist" containerID="3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc"
Apr 17 08:18:44.456952 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.456849 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc"} err="failed to get container status \"3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc\": rpc error: code = NotFound desc = could not find container \"3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc\": container with ID starting with 3a3b0b977dac840d126afd172c1d46aa9c4e330381f43b21f93cf2321f788cbc not found: ID does not exist"
Apr 17 08:18:44.674007 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.673974 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 17 08:18:44.679833 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.679805 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 17 08:18:44.802831 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:44.802777 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" path="/var/lib/kubelet/pods/ce2cee61-5c50-4be7-b2a6-a92410798b4c/volumes"
Apr 17 08:18:45.719223 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:45.719167 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:18:45.745490 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:45.745449 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:18:55.719081 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:55.719030 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:18:55.744928 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:18:55.744887 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:19:05.719136 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:05.719080 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused"
Apr 17 08:19:05.745361 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:05.745320 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused"
Apr 17 08:19:15.727940 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:15.727902 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm"
Apr 17 08:19:15.741193 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:15.741171 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm"
Apr 17 08:19:15.756513 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:15.756491 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc"
Apr 17 08:19:15.763904 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:15.763887 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc"
Apr 17 08:19:27.200128 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:27.200092 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc"]
Apr 17 08:19:27.200569 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:27.200459 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" containerID="cri-o://92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01" gracePeriod=30
Apr 17 08:19:27.209485 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:27.209457 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm"]
Apr 17 08:19:27.210035 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:27.209882 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" containerID="cri-o://2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94" gracePeriod=30
Apr 17 08:19:57.209938 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.209880 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="llm-d-routing-sidecar" containerID="cri-o://bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378" gracePeriod=2
Apr 17 08:19:57.497801 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.497763 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc"
Apr 17 08:19:57.500883 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.500861 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm_2dd20ad5-d04a-424d-8da8-24c269e59eac/main/0.log"
Apr 17 08:19:57.501467 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.501450 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm"
Apr 17 08:19:57.577486 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577458 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-home\") pod \"deabb10c-4e65-4256-b8ac-da6a0c980429\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") "
Apr 17 08:19:57.577662 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577494 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-dshm\") pod \"2dd20ad5-d04a-424d-8da8-24c269e59eac\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") "
Apr 17 08:19:57.577662 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577525 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-home\") pod \"2dd20ad5-d04a-424d-8da8-24c269e59eac\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") "
Apr 17 08:19:57.577662 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577555 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-dshm\") pod \"deabb10c-4e65-4256-b8ac-da6a0c980429\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") "
Apr 17 08:19:57.577662 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577573 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-kserve-provision-location\") pod \"deabb10c-4e65-4256-b8ac-da6a0c980429\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") "
Apr 17 08:19:57.577662 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577598 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/deabb10c-4e65-4256-b8ac-da6a0c980429-tls-certs\") pod \"deabb10c-4e65-4256-b8ac-da6a0c980429\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") "
Apr 17 08:19:57.577662 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577630 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djf4x\" (UniqueName: \"kubernetes.io/projected/deabb10c-4e65-4256-b8ac-da6a0c980429-kube-api-access-djf4x\") pod \"deabb10c-4e65-4256-b8ac-da6a0c980429\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") "
Apr 17 08:19:57.578002 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577664 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-kserve-provision-location\") pod \"2dd20ad5-d04a-424d-8da8-24c269e59eac\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") "
Apr 17 08:19:57.578002 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577970 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-home" (OuterVolumeSpecName: "home") pod "deabb10c-4e65-4256-b8ac-da6a0c980429" (UID: "deabb10c-4e65-4256-b8ac-da6a0c980429"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:19:57.578002 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.577989 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-home" (OuterVolumeSpecName: "home") pod "2dd20ad5-d04a-424d-8da8-24c269e59eac" (UID: "2dd20ad5-d04a-424d-8da8-24c269e59eac"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:19:57.578208 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.578060 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dd20ad5-d04a-424d-8da8-24c269e59eac-tls-certs\") pod \"2dd20ad5-d04a-424d-8da8-24c269e59eac\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") "
Apr 17 08:19:57.578208 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.578091 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7bdx\" (UniqueName: \"kubernetes.io/projected/2dd20ad5-d04a-424d-8da8-24c269e59eac-kube-api-access-v7bdx\") pod \"2dd20ad5-d04a-424d-8da8-24c269e59eac\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") "
Apr 17 08:19:57.578208 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.578173 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-model-cache\") pod \"deabb10c-4e65-4256-b8ac-da6a0c980429\" (UID: \"deabb10c-4e65-4256-b8ac-da6a0c980429\") "
Apr 17 08:19:57.578208 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.578203 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-model-cache\") pod \"2dd20ad5-d04a-424d-8da8-24c269e59eac\" (UID: \"2dd20ad5-d04a-424d-8da8-24c269e59eac\") "
Apr 17 08:19:57.579071 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.578815 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:19:57.579071 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.578841 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-home\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:19:57.579252 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.579173 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-model-cache" (OuterVolumeSpecName: "model-cache") pod "2dd20ad5-d04a-424d-8da8-24c269e59eac" (UID: "2dd20ad5-d04a-424d-8da8-24c269e59eac"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:19:57.579400 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.579371 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-model-cache" (OuterVolumeSpecName: "model-cache") pod "deabb10c-4e65-4256-b8ac-da6a0c980429" (UID: "deabb10c-4e65-4256-b8ac-da6a0c980429"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:19:57.580384 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.580350 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-dshm" (OuterVolumeSpecName: "dshm") pod "deabb10c-4e65-4256-b8ac-da6a0c980429" (UID: "deabb10c-4e65-4256-b8ac-da6a0c980429"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:19:57.580384 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.580359 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deabb10c-4e65-4256-b8ac-da6a0c980429-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "deabb10c-4e65-4256-b8ac-da6a0c980429" (UID: "deabb10c-4e65-4256-b8ac-da6a0c980429"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:19:57.580518 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.580387 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-dshm" (OuterVolumeSpecName: "dshm") pod "2dd20ad5-d04a-424d-8da8-24c269e59eac" (UID: "2dd20ad5-d04a-424d-8da8-24c269e59eac"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:19:57.580668 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.580644 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deabb10c-4e65-4256-b8ac-da6a0c980429-kube-api-access-djf4x" (OuterVolumeSpecName: "kube-api-access-djf4x") pod "deabb10c-4e65-4256-b8ac-da6a0c980429" (UID: "deabb10c-4e65-4256-b8ac-da6a0c980429"). InnerVolumeSpecName "kube-api-access-djf4x".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:19:57.581462 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.581436 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd20ad5-d04a-424d-8da8-24c269e59eac-kube-api-access-v7bdx" (OuterVolumeSpecName: "kube-api-access-v7bdx") pod "2dd20ad5-d04a-424d-8da8-24c269e59eac" (UID: "2dd20ad5-d04a-424d-8da8-24c269e59eac"). InnerVolumeSpecName "kube-api-access-v7bdx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:19:57.581612 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.581594 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd20ad5-d04a-424d-8da8-24c269e59eac-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2dd20ad5-d04a-424d-8da8-24c269e59eac" (UID: "2dd20ad5-d04a-424d-8da8-24c269e59eac"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:19:57.608391 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.608364 2573 generic.go:358] "Generic (PLEG): container finished" podID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerID="92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01" exitCode=137 Apr 17 08:19:57.608496 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.608415 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" event={"ID":"deabb10c-4e65-4256-b8ac-da6a0c980429","Type":"ContainerDied","Data":"92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01"} Apr 17 08:19:57.608496 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.608454 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" Apr 17 08:19:57.608496 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.608473 2573 scope.go:117] "RemoveContainer" containerID="92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01" Apr 17 08:19:57.608672 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.608459 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc" event={"ID":"deabb10c-4e65-4256-b8ac-da6a0c980429","Type":"ContainerDied","Data":"514a0d848960c5a4bb43254a7914d4f42ec4ef7ebfdb17446ae2399852f52e65"} Apr 17 08:19:57.609961 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.609944 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm_2dd20ad5-d04a-424d-8da8-24c269e59eac/main/0.log" Apr 17 08:19:57.610603 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.610585 2573 generic.go:358] "Generic (PLEG): container finished" podID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerID="2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94" exitCode=137 Apr 17 08:19:57.610698 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.610604 2573 generic.go:358] "Generic (PLEG): container finished" podID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerID="bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378" exitCode=0 Apr 17 08:19:57.610698 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.610633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" event={"ID":"2dd20ad5-d04a-424d-8da8-24c269e59eac","Type":"ContainerDied","Data":"2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94"} Apr 17 08:19:57.610698 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.610656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" event={"ID":"2dd20ad5-d04a-424d-8da8-24c269e59eac","Type":"ContainerDied","Data":"bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378"} Apr 17 08:19:57.610698 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.610662 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" Apr 17 08:19:57.610698 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.610673 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm" event={"ID":"2dd20ad5-d04a-424d-8da8-24c269e59eac","Type":"ContainerDied","Data":"6dee9cc20f74d3cd22b226f0e745a4f8a04ca174a70f533ff61bbcc4d9f3f1f2"} Apr 17 08:19:57.637020 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.636985 2573 scope.go:117] "RemoveContainer" containerID="bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264" Apr 17 08:19:57.659630 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.659600 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "deabb10c-4e65-4256-b8ac-da6a0c980429" (UID: "deabb10c-4e65-4256-b8ac-da6a0c980429"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:19:57.662765 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.662736 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2dd20ad5-d04a-424d-8da8-24c269e59eac" (UID: "2dd20ad5-d04a-424d-8da8-24c269e59eac"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:19:57.679658 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.679633 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-djf4x\" (UniqueName: \"kubernetes.io/projected/deabb10c-4e65-4256-b8ac-da6a0c980429-kube-api-access-djf4x\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.679723 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.679657 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.679723 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.679678 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dd20ad5-d04a-424d-8da8-24c269e59eac-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.679723 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.679688 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v7bdx\" (UniqueName: \"kubernetes.io/projected/2dd20ad5-d04a-424d-8da8-24c269e59eac-kube-api-access-v7bdx\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.679723 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.679704 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.679723 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.679715 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-model-cache\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.679723 ip-10-0-137-8 kubenswrapper[2573]: I0417 
08:19:57.679722 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dd20ad5-d04a-424d-8da8-24c269e59eac-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.679933 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.679731 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-dshm\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.679933 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.679740 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/deabb10c-4e65-4256-b8ac-da6a0c980429-kserve-provision-location\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.679933 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.679748 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/deabb10c-4e65-4256-b8ac-da6a0c980429-tls-certs\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\"" Apr 17 08:19:57.712362 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.712339 2573 scope.go:117] "RemoveContainer" containerID="92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01" Apr 17 08:19:57.712710 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:19:57.712687 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01\": container with ID starting with 92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01 not found: ID does not exist" containerID="92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01" Apr 17 08:19:57.712772 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.712730 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01"} err="failed to get container status \"92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01\": rpc error: code = NotFound desc = could not find container \"92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01\": container with ID starting with 92519481f6b70080bf4a69c11862d4d6ee72f3a863bb93f20619bfcdd7f2cd01 not found: ID does not exist" Apr 17 08:19:57.712772 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.712751 2573 scope.go:117] "RemoveContainer" containerID="bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264" Apr 17 08:19:57.713076 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:19:57.713058 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264\": container with ID starting with bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264 not found: ID does not exist" containerID="bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264" Apr 17 08:19:57.713132 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.713085 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264"} err="failed to get container status \"bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264\": rpc error: code = NotFound desc = could not find container \"bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264\": container with ID starting with bbd60ad2c5d7e422e1ccef6e6459e6eb16819e298b87fedd91ed8c5d60bd6264 not found: ID does not exist" Apr 17 08:19:57.713132 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.713109 2573 scope.go:117] "RemoveContainer" containerID="2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94" Apr 17 08:19:57.733511 ip-10-0-137-8 
kubenswrapper[2573]: I0417 08:19:57.733494 2573 scope.go:117] "RemoveContainer" containerID="e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b" Apr 17 08:19:57.800444 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.800426 2573 scope.go:117] "RemoveContainer" containerID="bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378" Apr 17 08:19:57.807558 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.807535 2573 scope.go:117] "RemoveContainer" containerID="2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94" Apr 17 08:19:57.807877 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:19:57.807852 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94\": container with ID starting with 2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94 not found: ID does not exist" containerID="2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94" Apr 17 08:19:57.807964 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.807884 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94"} err="failed to get container status \"2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94\": rpc error: code = NotFound desc = could not find container \"2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94\": container with ID starting with 2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94 not found: ID does not exist" Apr 17 08:19:57.807964 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.807906 2573 scope.go:117] "RemoveContainer" containerID="e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b" Apr 17 08:19:57.808154 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:19:57.808135 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b\": container with ID starting with e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b not found: ID does not exist" containerID="e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b" Apr 17 08:19:57.808201 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.808160 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b"} err="failed to get container status \"e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b\": rpc error: code = NotFound desc = could not find container \"e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b\": container with ID starting with e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b not found: ID does not exist" Apr 17 08:19:57.808201 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.808178 2573 scope.go:117] "RemoveContainer" containerID="bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378" Apr 17 08:19:57.808428 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:19:57.808405 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378\": container with ID starting with bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378 not found: ID does not exist" containerID="bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378" Apr 17 08:19:57.808479 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.808437 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378"} err="failed to get container status \"bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378\": rpc error: code = NotFound desc = 
could not find container \"bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378\": container with ID starting with bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378 not found: ID does not exist" Apr 17 08:19:57.808479 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.808453 2573 scope.go:117] "RemoveContainer" containerID="2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94" Apr 17 08:19:57.808674 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.808654 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94"} err="failed to get container status \"2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94\": rpc error: code = NotFound desc = could not find container \"2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94\": container with ID starting with 2a487397fb987e2d44e66705c1c99b9fba324bb0fae8a57dc5feed44c7ac0a94 not found: ID does not exist" Apr 17 08:19:57.808722 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.808675 2573 scope.go:117] "RemoveContainer" containerID="e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b" Apr 17 08:19:57.808972 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.808953 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b"} err="failed to get container status \"e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b\": rpc error: code = NotFound desc = could not find container \"e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b\": container with ID starting with e4bb1d79fe20efdf6ed4445f3b4168457276fb5951b1910216ed54099a2f553b not found: ID does not exist" Apr 17 08:19:57.808972 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.808973 2573 scope.go:117] "RemoveContainer" 
containerID="bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378" Apr 17 08:19:57.809190 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.809169 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378"} err="failed to get container status \"bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378\": rpc error: code = NotFound desc = could not find container \"bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378\": container with ID starting with bf26487119e60ce106740509d2660d6ae06708985691f71bf3a635b903c77378 not found: ID does not exist" Apr 17 08:19:57.933636 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.933604 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc"] Apr 17 08:19:57.937730 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.937699 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-749cdc56f-q6ttc"] Apr 17 08:19:57.949325 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.949304 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm"] Apr 17 08:19:57.956948 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:57.956924 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6f688d6ccb-qjxsm"] Apr 17 08:19:58.802586 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:58.802552 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" path="/var/lib/kubelet/pods/2dd20ad5-d04a-424d-8da8-24c269e59eac/volumes" Apr 17 08:19:58.803052 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:19:58.803038 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" path="/var/lib/kubelet/pods/deabb10c-4e65-4256-b8ac-da6a0c980429/volumes" Apr 17 08:21:56.833756 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:21:56.833725 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 08:21:56.838054 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:21:56.838037 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 08:22:10.133149 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133112 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmsp4/must-gather-q2vzk"] Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133432 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133443 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133458 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="llm-d-routing-sidecar" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133465 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="llm-d-routing-sidecar" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133471 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="llm-d-routing-sidecar" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 
08:22:10.133479 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="llm-d-routing-sidecar" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133486 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="storage-initializer" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133492 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="storage-initializer" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133498 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133502 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133510 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="storage-initializer" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133516 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="storage-initializer" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133523 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" Apr 17 08:22:10.133520 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133528 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133534 2573 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="storage-initializer" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133540 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="storage-initializer" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133551 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133556 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133562 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133567 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133574 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="storage-initializer" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133579 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="storage-initializer" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133586 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="storage-initializer" Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133590 2573 
state_mem.go:107] "Deleted CPUSet assignment" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="storage-initializer"
Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133638 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="llm-d-routing-sidecar"
Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133650 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="main"
Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133656 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fcaffb1-59dc-4fc4-8326-c45b11f3b0b9" containerName="llm-d-routing-sidecar"
Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133662 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce2cee61-5c50-4be7-b2a6-a92410798b4c" containerName="main"
Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133668 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="deabb10c-4e65-4256-b8ac-da6a0c980429" containerName="main"
Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133673 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="38e5717f-cc15-46bf-ad46-18e2bee0f699" containerName="main"
Apr 17 08:22:10.134074 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.133678 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2dd20ad5-d04a-424d-8da8-24c269e59eac" containerName="main"
Apr 17 08:22:10.137076 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.137055 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:10.139917 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.139898 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmsp4\"/\"openshift-service-ca.crt\""
Apr 17 08:22:10.140024 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.139986 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmsp4\"/\"default-dockercfg-xspm7\""
Apr 17 08:22:10.140110 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.140097 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmsp4\"/\"kube-root-ca.crt\""
Apr 17 08:22:10.143858 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.143835 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmsp4/must-gather-q2vzk"]
Apr 17 08:22:10.270094 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.270060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-must-gather-output\") pod \"must-gather-q2vzk\" (UID: \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\") " pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:10.270242 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.270168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbsck\" (UniqueName: \"kubernetes.io/projected/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-kube-api-access-rbsck\") pod \"must-gather-q2vzk\" (UID: \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\") " pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:10.371228 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.371196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbsck\" (UniqueName: \"kubernetes.io/projected/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-kube-api-access-rbsck\") pod \"must-gather-q2vzk\" (UID: \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\") " pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:10.371376 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.371277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-must-gather-output\") pod \"must-gather-q2vzk\" (UID: \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\") " pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:10.371589 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.371574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-must-gather-output\") pod \"must-gather-q2vzk\" (UID: \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\") " pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:10.382674 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.382653 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbsck\" (UniqueName: \"kubernetes.io/projected/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-kube-api-access-rbsck\") pod \"must-gather-q2vzk\" (UID: \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\") " pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:10.447288 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.447216 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:10.570966 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.570929 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmsp4/must-gather-q2vzk"]
Apr 17 08:22:10.573342 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:22:10.573311 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28bf6bfa_41ca_4972_9bec_b32f9fdaf678.slice/crio-13e00d81e0c3a2f4784bb2fd1570f6e2e2adf174fae441e4c87e22d64e068639 WatchSource:0}: Error finding container 13e00d81e0c3a2f4784bb2fd1570f6e2e2adf174fae441e4c87e22d64e068639: Status 404 returned error can't find the container with id 13e00d81e0c3a2f4784bb2fd1570f6e2e2adf174fae441e4c87e22d64e068639
Apr 17 08:22:10.574860 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:10.574840 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:22:11.048394 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:11.048363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" event={"ID":"28bf6bfa-41ca-4972-9bec-b32f9fdaf678","Type":"ContainerStarted","Data":"13e00d81e0c3a2f4784bb2fd1570f6e2e2adf174fae441e4c87e22d64e068639"}
Apr 17 08:22:16.070473 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:16.070441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" event={"ID":"28bf6bfa-41ca-4972-9bec-b32f9fdaf678","Type":"ContainerStarted","Data":"34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690"}
Apr 17 08:22:16.070473 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:16.070477 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" event={"ID":"28bf6bfa-41ca-4972-9bec-b32f9fdaf678","Type":"ContainerStarted","Data":"6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0"}
Apr 17 08:22:16.087377 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:16.087326 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" podStartSLOduration=1.626261941 podStartE2EDuration="6.087313243s" podCreationTimestamp="2026-04-17 08:22:10 +0000 UTC" firstStartedPulling="2026-04-17 08:22:10.574968546 +0000 UTC m=+1814.321025322" lastFinishedPulling="2026-04-17 08:22:15.036019848 +0000 UTC m=+1818.782076624" observedRunningTime="2026-04-17 08:22:16.086485016 +0000 UTC m=+1819.832541835" watchObservedRunningTime="2026-04-17 08:22:16.087313243 +0000 UTC m=+1819.833370037"
Apr 17 08:22:40.214154 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:40.214126 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-bb2rq_992920b3-1fa1-430b-a57d-ddc1fa16986c/manager/0.log"
Apr 17 08:22:40.287472 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:40.287439 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-c7cs2_d348c7b3-e753-4884-83ae-71ef67146d47/limitador/0.log"
Apr 17 08:22:40.302715 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:40.302691 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rpjbr_178d1a1b-5cd7-447e-8e8a-b24968742805/manager/0.log"
Apr 17 08:22:41.162942 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:41.162903 2573 generic.go:358] "Generic (PLEG): container finished" podID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" containerID="6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0" exitCode=0
Apr 17 08:22:41.163155 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:41.162974 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" event={"ID":"28bf6bfa-41ca-4972-9bec-b32f9fdaf678","Type":"ContainerDied","Data":"6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0"}
Apr 17 08:22:41.163324 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:41.163310 2573 scope.go:117] "RemoveContainer" containerID="6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0"
Apr 17 08:22:41.958678 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:41.958651 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tmsp4_must-gather-q2vzk_28bf6bfa-41ca-4972-9bec-b32f9fdaf678/gather/0.log"
Apr 17 08:22:45.292966 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:45.292940 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fw888_03f66286-e29a-494b-a307-9a269b5cd89f/global-pull-secret-syncer/0.log"
Apr 17 08:22:45.350383 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:45.350357 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dqw8q_b7452b81-2d05-443f-9a0b-287e9bb664d2/konnectivity-agent/0.log"
Apr 17 08:22:45.419002 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:45.418973 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-8.ec2.internal_f695ff27a724f87f07e7f9438b811560/haproxy/0.log"
Apr 17 08:22:47.449426 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.449381 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tmsp4/must-gather-q2vzk"]
Apr 17 08:22:47.449889 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.449649 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" containerName="copy" containerID="cri-o://34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690" gracePeriod=2
Apr 17 08:22:47.451212 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.451185 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tmsp4/must-gather-q2vzk"]
Apr 17 08:22:47.452044 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.452013 2573 status_manager.go:895] "Failed to get status for pod" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" err="pods \"must-gather-q2vzk\" is forbidden: User \"system:node:ip-10-0-137-8.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-tmsp4\": no relationship found between node 'ip-10-0-137-8.ec2.internal' and this object"
Apr 17 08:22:47.690257 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.690235 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tmsp4_must-gather-q2vzk_28bf6bfa-41ca-4972-9bec-b32f9fdaf678/copy/0.log"
Apr 17 08:22:47.690592 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.690577 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:47.692756 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.692732 2573 status_manager.go:895] "Failed to get status for pod" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" err="pods \"must-gather-q2vzk\" is forbidden: User \"system:node:ip-10-0-137-8.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-tmsp4\": no relationship found between node 'ip-10-0-137-8.ec2.internal' and this object"
Apr 17 08:22:47.805199 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.805130 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbsck\" (UniqueName: \"kubernetes.io/projected/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-kube-api-access-rbsck\") pod \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\" (UID: \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\") "
Apr 17 08:22:47.805199 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.805175 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-must-gather-output\") pod \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\" (UID: \"28bf6bfa-41ca-4972-9bec-b32f9fdaf678\") "
Apr 17 08:22:47.807332 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.807306 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-kube-api-access-rbsck" (OuterVolumeSpecName: "kube-api-access-rbsck") pod "28bf6bfa-41ca-4972-9bec-b32f9fdaf678" (UID: "28bf6bfa-41ca-4972-9bec-b32f9fdaf678"). InnerVolumeSpecName "kube-api-access-rbsck". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:22:47.811023 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.810991 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "28bf6bfa-41ca-4972-9bec-b32f9fdaf678" (UID: "28bf6bfa-41ca-4972-9bec-b32f9fdaf678"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:22:47.905684 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.905657 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbsck\" (UniqueName: \"kubernetes.io/projected/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-kube-api-access-rbsck\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:22:47.905684 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:47.905681 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/28bf6bfa-41ca-4972-9bec-b32f9fdaf678-must-gather-output\") on node \"ip-10-0-137-8.ec2.internal\" DevicePath \"\""
Apr 17 08:22:48.192314 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.192279 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tmsp4_must-gather-q2vzk_28bf6bfa-41ca-4972-9bec-b32f9fdaf678/copy/0.log"
Apr 17 08:22:48.192664 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.192640 2573 generic.go:358] "Generic (PLEG): container finished" podID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" containerID="34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690" exitCode=143
Apr 17 08:22:48.192764 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.192693 2573 scope.go:117] "RemoveContainer" containerID="34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690"
Apr 17 08:22:48.192764 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.192712 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmsp4/must-gather-q2vzk"
Apr 17 08:22:48.195293 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.195262 2573 status_manager.go:895] "Failed to get status for pod" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" err="pods \"must-gather-q2vzk\" is forbidden: User \"system:node:ip-10-0-137-8.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-tmsp4\": no relationship found between node 'ip-10-0-137-8.ec2.internal' and this object"
Apr 17 08:22:48.201091 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.200948 2573 scope.go:117] "RemoveContainer" containerID="6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0"
Apr 17 08:22:48.203651 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.203619 2573 status_manager.go:895] "Failed to get status for pod" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" pod="openshift-must-gather-tmsp4/must-gather-q2vzk" err="pods \"must-gather-q2vzk\" is forbidden: User \"system:node:ip-10-0-137-8.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-tmsp4\": no relationship found between node 'ip-10-0-137-8.ec2.internal' and this object"
Apr 17 08:22:48.214750 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.214733 2573 scope.go:117] "RemoveContainer" containerID="34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690"
Apr 17 08:22:48.215026 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:22:48.215009 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690\": container with ID starting with 34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690 not found: ID does not exist" containerID="34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690"
Apr 17 08:22:48.215077 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.215034 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690"} err="failed to get container status \"34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690\": rpc error: code = NotFound desc = could not find container \"34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690\": container with ID starting with 34dc8b2eb25304a26df884b74dd30d664ec9d3a899aecec37a69854ff02ab690 not found: ID does not exist"
Apr 17 08:22:48.215077 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.215050 2573 scope.go:117] "RemoveContainer" containerID="6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0"
Apr 17 08:22:48.215295 ip-10-0-137-8 kubenswrapper[2573]: E0417 08:22:48.215274 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0\": container with ID starting with 6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0 not found: ID does not exist" containerID="6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0"
Apr 17 08:22:48.215364 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.215305 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0"} err="failed to get container status \"6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0\": rpc error: code = NotFound desc = could not find container \"6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0\": container with ID starting with 6b08fc68611e03950f499dbd14957f37aeb2cac72814e68059aef353f1f95ca0 not found: ID does not exist"
Apr 17 08:22:48.802900 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:48.802872 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" path="/var/lib/kubelet/pods/28bf6bfa-41ca-4972-9bec-b32f9fdaf678/volumes"
Apr 17 08:22:49.471932 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:49.471901 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-bb2rq_992920b3-1fa1-430b-a57d-ddc1fa16986c/manager/0.log"
Apr 17 08:22:49.561355 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:49.561330 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-c7cs2_d348c7b3-e753-4884-83ae-71ef67146d47/limitador/0.log"
Apr 17 08:22:49.587856 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:49.587830 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rpjbr_178d1a1b-5cd7-447e-8e8a-b24968742805/manager/0.log"
Apr 17 08:22:50.722748 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:50.722721 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-6c5tk_07cc794d-dfcf-4290-beb1-51ec803617e1/cluster-monitoring-operator/0.log"
Apr 17 08:22:50.742849 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:50.742826 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-58jwp_55f08fcd-4e31-46fa-be87-935efefdb3d8/kube-state-metrics/0.log"
Apr 17 08:22:50.761703 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:50.761679 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-58jwp_55f08fcd-4e31-46fa-be87-935efefdb3d8/kube-rbac-proxy-main/0.log"
Apr 17 08:22:50.783666 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:50.783646 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-58jwp_55f08fcd-4e31-46fa-be87-935efefdb3d8/kube-rbac-proxy-self/0.log"
Apr 17 08:22:50.806724 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:50.806705 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-64978bc4d-47kr9_4d11a1ca-4f07-474f-a603-a2cf5a182233/metrics-server/0.log"
Apr 17 08:22:50.830295 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:50.830275 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-q6kz8_33b34eda-688e-48f5-83f8-7589fa815bfa/monitoring-plugin/0.log"
Apr 17 08:22:50.993691 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:50.993626 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tvlxz_ae2e1e06-a904-48d8-85a3-5aacdc99b560/node-exporter/0.log"
Apr 17 08:22:51.013987 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:51.013969 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tvlxz_ae2e1e06-a904-48d8-85a3-5aacdc99b560/kube-rbac-proxy/0.log"
Apr 17 08:22:51.033145 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:51.033129 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tvlxz_ae2e1e06-a904-48d8-85a3-5aacdc99b560/init-textfile/0.log"
Apr 17 08:22:51.144865 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:51.144846 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0967a60a-1d6e-48fc-b498-bd55233e70b0/prometheus/0.log"
Apr 17 08:22:51.160767 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:51.160749 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0967a60a-1d6e-48fc-b498-bd55233e70b0/config-reloader/0.log"
Apr 17 08:22:51.180532 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:51.180516 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0967a60a-1d6e-48fc-b498-bd55233e70b0/thanos-sidecar/0.log"
Apr 17 08:22:51.198995 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:51.198977 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0967a60a-1d6e-48fc-b498-bd55233e70b0/kube-rbac-proxy-web/0.log"
Apr 17 08:22:51.218859 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:51.218835 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0967a60a-1d6e-48fc-b498-bd55233e70b0/kube-rbac-proxy/0.log"
Apr 17 08:22:51.238185 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:51.238166 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0967a60a-1d6e-48fc-b498-bd55233e70b0/kube-rbac-proxy-thanos/0.log"
Apr 17 08:22:51.258499 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:51.258446 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0967a60a-1d6e-48fc-b498-bd55233e70b0/init-config-reloader/0.log"
Apr 17 08:22:52.818510 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:52.818481 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-lmh5b_9ab094e1-f37a-480c-ba6e-88c223afc6fb/networking-console-plugin/0.log"
Apr 17 08:22:53.903470 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.903435 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"]
Apr 17 08:22:53.903963 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.903942 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" containerName="gather"
Apr 17 08:22:53.904039 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.903967 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" containerName="gather"
Apr 17 08:22:53.904039 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.903978 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" containerName="copy"
Apr 17 08:22:53.904039 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.903985 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" containerName="copy"
Apr 17 08:22:53.904198 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.904077 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" containerName="gather"
Apr 17 08:22:53.904198 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.904088 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="28bf6bfa-41ca-4972-9bec-b32f9fdaf678" containerName="copy"
Apr 17 08:22:53.907302 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.907281 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:53.909855 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.909834 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jc8j6\"/\"kube-root-ca.crt\""
Apr 17 08:22:53.909941 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.909840 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jc8j6\"/\"openshift-service-ca.crt\""
Apr 17 08:22:53.910863 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.910850 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jc8j6\"/\"default-dockercfg-tn4lx\""
Apr 17 08:22:53.913327 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:53.913307 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"]
Apr 17 08:22:54.056633 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.056608 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhpj\" (UniqueName: \"kubernetes.io/projected/b6172152-0d6f-4a85-8837-f43271e02271-kube-api-access-kjhpj\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.056768 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.056662 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-lib-modules\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.056768 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.056747 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-sys\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.056880 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.056805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-podres\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.056880 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.056833 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-proc\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.158222 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.158152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-lib-modules\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.158222 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.158193 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-sys\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.158434 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.158225 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-podres\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.158434 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.158248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-proc\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.158434 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.158286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-sys\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.158434 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.158289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhpj\" (UniqueName: \"kubernetes.io/projected/b6172152-0d6f-4a85-8837-f43271e02271-kube-api-access-kjhpj\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.158434 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.158341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-proc\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.158434 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.158343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-lib-modules\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.158434 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.158355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b6172152-0d6f-4a85-8837-f43271e02271-podres\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.166761 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.166730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhpj\" (UniqueName: \"kubernetes.io/projected/b6172152-0d6f-4a85-8837-f43271e02271-kube-api-access-kjhpj\") pod \"perf-node-gather-daemonset-9dqkg\" (UID: \"b6172152-0d6f-4a85-8837-f43271e02271\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.217895 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.217875 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:54.338943 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.338911 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"]
Apr 17 08:22:54.342102 ip-10-0-137-8 kubenswrapper[2573]: W0417 08:22:54.342077 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb6172152_0d6f_4a85_8837_f43271e02271.slice/crio-e89396f1d63d2a5d4496710df02973ef6c2d036da1a8fcd654c0e34d9a6d160e WatchSource:0}: Error finding container e89396f1d63d2a5d4496710df02973ef6c2d036da1a8fcd654c0e34d9a6d160e: Status 404 returned error can't find the container with id e89396f1d63d2a5d4496710df02973ef6c2d036da1a8fcd654c0e34d9a6d160e
Apr 17 08:22:54.917566 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.917536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jvx7c_a31b36b1-77de-4517-8c23-566021eb1d32/dns/0.log"
Apr 17 08:22:54.935488 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:54.935466 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jvx7c_a31b36b1-77de-4517-8c23-566021eb1d32/kube-rbac-proxy/0.log"
Apr 17 08:22:55.015087 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:55.015062 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r5k8g_4c7d0c52-01d6-4b13-b631-cd9e35e59fa6/dns-node-resolver/0.log"
Apr 17 08:22:55.215849 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:55.215757 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg" event={"ID":"b6172152-0d6f-4a85-8837-f43271e02271","Type":"ContainerStarted","Data":"ddf958edccbc702f1ff3f4107d19710e0b8fbf66ca1e21f2bf8f4ce9816b80c5"}
Apr 17 08:22:55.215849 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:55.215801 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg" event={"ID":"b6172152-0d6f-4a85-8837-f43271e02271","Type":"ContainerStarted","Data":"e89396f1d63d2a5d4496710df02973ef6c2d036da1a8fcd654c0e34d9a6d160e"}
Apr 17 08:22:55.216035 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:55.215896 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:22:55.232816 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:55.232758 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg" podStartSLOduration=2.23274756 podStartE2EDuration="2.23274756s" podCreationTimestamp="2026-04-17 08:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:22:55.230778166 +0000 UTC m=+1858.976834960" watchObservedRunningTime="2026-04-17 08:22:55.23274756 +0000 UTC m=+1858.978804353"
Apr 17 08:22:55.473739 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:55.473659 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-549bbc77ff-hvf4f_fd080088-4dd5-4cdc-94f9-ebe6ce802c67/registry/0.log"
Apr 17 08:22:55.513137 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:55.513108 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kspjc_d3454833-6f08-4cd5-9692-e10872c4ec39/node-ca/0.log"
Apr 17 08:22:56.845985 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:56.845952 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cj7gw_faa9121c-e579-414f-8d7d-77beba5608ea/serve-healthcheck-canary/0.log"
Apr 17 08:22:57.443922 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:57.443894 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wshg8_b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f/kube-rbac-proxy/0.log"
Apr 17 08:22:57.462335 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:57.462313 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wshg8_b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f/exporter/0.log"
Apr 17 08:22:57.485318 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:57.485289 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wshg8_b76c5e85-a5e6-4e58-9aa3-f2ffe3d27e1f/extractor/0.log"
Apr 17 08:22:59.989918 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:22:59.989890 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5c76446df-cw2ch_3c8ec095-5266-47dc-bd56-c3e429562206/manager/0.log"
Apr 17 08:23:00.855707 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:00.855679 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-mbcj2_9e59cfc6-5f20-419b-aab8-74be02eb7d00/seaweedfs/0.log"
Apr 17 08:23:01.229015 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:01.228984 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-9dqkg"
Apr 17 08:23:05.292710 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:05.292678 2573 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-x4spq_35b0b385-72ff-4b20-9adc-aaed0ec7e41e/migrator/0.log" Apr 17 08:23:05.314016 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:05.313989 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-x4spq_35b0b385-72ff-4b20-9adc-aaed0ec7e41e/graceful-termination/0.log" Apr 17 08:23:06.846122 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:06.846094 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-crv6m_b4eb62e2-ab98-4772-9149-6a8a3cd016b6/kube-multus-additional-cni-plugins/0.log" Apr 17 08:23:06.864385 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:06.864357 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-crv6m_b4eb62e2-ab98-4772-9149-6a8a3cd016b6/egress-router-binary-copy/0.log" Apr 17 08:23:06.882516 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:06.882489 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-crv6m_b4eb62e2-ab98-4772-9149-6a8a3cd016b6/cni-plugins/0.log" Apr 17 08:23:06.900033 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:06.900009 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-crv6m_b4eb62e2-ab98-4772-9149-6a8a3cd016b6/bond-cni-plugin/0.log" Apr 17 08:23:06.920097 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:06.920080 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-crv6m_b4eb62e2-ab98-4772-9149-6a8a3cd016b6/routeoverride-cni/0.log" Apr 17 08:23:06.940957 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:06.940936 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-crv6m_b4eb62e2-ab98-4772-9149-6a8a3cd016b6/whereabouts-cni-bincopy/0.log" Apr 17 
08:23:06.959559 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:06.959536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-crv6m_b4eb62e2-ab98-4772-9149-6a8a3cd016b6/whereabouts-cni/0.log" Apr 17 08:23:07.014451 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:07.014423 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hvczg_5c58d588-382f-46d8-be38-9af05f699f8f/kube-multus/0.log" Apr 17 08:23:07.064337 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:07.064312 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ht68l_341e9133-613e-45d4-bb0a-a187c93be340/network-metrics-daemon/0.log" Apr 17 08:23:07.081642 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:07.081622 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ht68l_341e9133-613e-45d4-bb0a-a187c93be340/kube-rbac-proxy/0.log" Apr 17 08:23:07.960439 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:07.960369 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-controller/0.log" Apr 17 08:23:07.981073 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:07.981017 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/0.log" Apr 17 08:23:07.989649 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:07.989633 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovn-acl-logging/1.log" Apr 17 08:23:08.011875 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:08.011857 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/kube-rbac-proxy-node/0.log" Apr 17 
08:23:08.047205 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:08.047184 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 08:23:08.069805 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:08.069771 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/northd/0.log" Apr 17 08:23:08.097683 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:08.097665 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/nbdb/0.log" Apr 17 08:23:08.119753 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:08.119734 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/sbdb/0.log" Apr 17 08:23:08.222769 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:08.222705 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2vdv_cf2999c2-b9c3-4067-b076-2b30bde1888e/ovnkube-controller/0.log" Apr 17 08:23:09.742819 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:09.742772 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-6ldqm_f5b9d306-bf9d-4e85-b6da-267a2fd97905/check-endpoints/0.log" Apr 17 08:23:09.784907 ip-10-0-137-8 kubenswrapper[2573]: I0417 08:23:09.784880 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-scd9x_dd860804-99b0-4bb4-9784-21b0e42ce760/network-check-target-container/0.log"