Apr 16 13:56:48.303654 ip-10-0-131-99 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:48.303668 ip-10-0-131-99 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:48.303678 ip-10-0-131-99 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:48.303996 ip-10-0-131-99 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:56:58.367812 ip-10-0-131-99 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:56:58.367830 ip-10-0-131-99 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4cd2ed0709474294a36c8c35ccc61b8c --
Apr 16 13:59:13.245355 ip-10-0-131-99 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:13.727755 ip-10-0-131-99 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:13.727755 ip-10-0-131-99 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:13.727755 ip-10-0-131-99 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:13.727755 ip-10-0-131-99 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:13.727755 ip-10-0-131-99 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:13.731374 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.731286 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:13.738447 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738418 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:13.738447 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738441 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:13.738447 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738445 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:13.738447 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738448 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:13.738447 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738451 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:13.738447 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738454 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:13.738447 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738459 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738462 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738465 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738468 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738471 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738475 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738478 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738480 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738483 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738486 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738488 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738491 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738494 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738496 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738499 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738502 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738504 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738507 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738510 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738512 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:13.738712 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738515 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738518 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738520 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738523 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738526 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738528 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738531 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738534 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738536 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738547 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738553 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738558 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738560 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738569 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738575 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738579 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738582 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738585 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:13.739224 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738588 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738592 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738595 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738598 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738601 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738603 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738606 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738609 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738612 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738615 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738618 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738621 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738624 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738627 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738630 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738632 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738635 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738637 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738640 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738643 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:13.739698 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738645 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738648 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738651 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738654 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738656 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738659 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738662 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738665 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738668 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738671 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738674 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738677 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738679 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738681 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738685 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738688 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738690 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738693 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738695 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738698 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:13.740188 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738701 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.738705 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739133 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739139 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739142 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739145 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739148 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739150 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739153 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739156 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739159 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739161 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739164 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739167 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739169 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739172 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739175 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739177 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739180 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739182 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:13.740665 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739187 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739190 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739193 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739195 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739198 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739200 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739203 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739206 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739209 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739211 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739214 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739216 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739219 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739221 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739224 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739227 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739229 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739233 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739235 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739238 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:13.741189 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739241 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739243 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739246 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739248 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739251 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739253 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739256 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739259 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739263 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739265 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739268 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739271 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739275 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739278 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739282 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739284 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739287 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739289 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739292 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:13.741706 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739295 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739298 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739301 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739303 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739306 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739308 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739311 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739314 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739316 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739320 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739324 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739327 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739330 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739333 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739336 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739338 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739341 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739344 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739346 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:13.742184 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739351 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739354 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739356 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739359 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739361 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739364 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739367 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739369 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739372 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.739375 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740772 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740796 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740803 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740808 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740814 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740817 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740822 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740827 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740830 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740833 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740837 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:13.742657 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740841 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740844 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740847 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740850 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740853 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740856 2571 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740859 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740862 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740867 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740870 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740874 2571 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740877 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740881 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740886 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740889 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740892 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740895 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740898 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740901 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740904 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740907 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740910 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740914 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740917 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740920 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:13.743194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740923 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740927 2571 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740930 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740934 2571 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740937 2571 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740941 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740944 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740947 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740950 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740953 2571 flags.go:64] FLAG:
--eviction-minimum-reclaim="" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740956 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740959 2571 flags.go:64] FLAG: --eviction-soft="" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740962 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740965 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740968 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740972 2571 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740975 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740978 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740981 2571 flags.go:64] FLAG: --feature-gates="" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740985 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740988 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740991 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740995 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.740998 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: 
I0416 13:59:13.741001 2571 flags.go:64] FLAG: --help="false" Apr 16 13:59:13.743801 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741004 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-131-99.ec2.internal" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741007 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741010 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741013 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741016 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741020 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741022 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741025 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741028 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741032 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741035 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741038 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741040 2571 flags.go:64] FLAG: --kube-reserved="" Apr 16 
13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741044 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741046 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741049 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741052 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741055 2571 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741058 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741061 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741064 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741086 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741090 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741095 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:59:13.744418 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741098 2571 flags.go:64] FLAG: --logging-format="text" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741102 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741105 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741108 2571 
flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741111 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741116 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741120 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741128 2571 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741131 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741134 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741138 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741141 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741144 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741147 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741151 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741158 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741161 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741165 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 
13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741169 2571 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741172 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741177 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741180 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741183 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741186 2571 flags.go:64] FLAG: --port="10250" Apr 16 13:59:13.744991 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741190 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741193 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b2f9c2e031b93bb9" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741196 2571 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741199 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741203 2571 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741206 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741210 2571 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741214 2571 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: 
I0416 13:59:13.741217 2571 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741220 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741223 2571 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741226 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741230 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741233 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741235 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741238 2571 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741241 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741245 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741248 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741250 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741253 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741256 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741260 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:59:13.745594 
ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741263 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741266 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741268 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:59:13.745594 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741271 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741275 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741278 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741281 2571 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741284 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741290 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741292 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741295 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741300 2571 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741303 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741306 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741309 2571 flags.go:64] FLAG: 
--topology-manager-policy-options="" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741312 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741316 2571 flags.go:64] FLAG: --v="2" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741320 2571 flags.go:64] FLAG: --version="false" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741324 2571 flags.go:64] FLAG: --vmodule="" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741328 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741331 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741439 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741444 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741447 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741449 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741453 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741456 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:13.746234 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741458 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 
13:59:13.741461 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741464 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741466 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741469 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741471 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741474 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741476 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741479 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741481 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741484 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741487 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741489 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741492 2571 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741494 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741497 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741499 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741502 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741504 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:13.746866 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741507 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741512 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741515 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741518 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741520 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741523 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741525 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 
13:59:13.741528 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741531 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741535 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741538 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741540 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741543 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741545 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741548 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741550 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741553 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741556 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741558 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741561 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 
13:59:13.747405 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741564 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741566 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741569 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741572 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741575 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741579 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741582 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741585 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741587 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741590 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741592 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741595 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 
13:59:13.741598 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741605 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741607 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741610 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741613 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741615 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741618 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:13.747904 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741621 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741623 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741626 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741629 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741631 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741634 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:13.748382 
ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741637 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741639 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741642 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741644 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741647 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741649 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741652 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741654 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741657 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741659 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741662 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741664 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741667 2571 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy
Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741670 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:13.748382 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741672 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.741675 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.741683 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.748116 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.748140 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748189 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748195 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748200 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748204 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748207 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748210 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748213 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748216 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748219 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748221 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:13.748883 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748224 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748227 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748230 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748232 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748235 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748238 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748240 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748243 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748246 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748248 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748251 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748253 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748256 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748259 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748261 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748264 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748267 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748269 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748272 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748274 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:13.749273 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748277 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748281 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748284 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748287 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748290 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748292 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748295 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748297 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748300 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748303 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748305 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748307 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748310 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748313 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748316 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748318 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748320 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748323 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748325 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748328 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:13.749764 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748331 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748333 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748335 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748338 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748340 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748343 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748345 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748348 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748350 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748353 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748356 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748358 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748361 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748364 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748368 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748371 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748374 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748376 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748379 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:13.750263 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748383 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748387 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748390 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748393 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748396 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748399 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748401 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748404 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748407 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748409 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748412 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748414 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748417 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748419 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748422 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748424 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:13.750865 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748427 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.748432 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748544 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748549 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748552 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748555 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748558 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748561 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748564 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748567 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748570 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748572 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748575 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748578 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748581 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748584 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:13.751276 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748586 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748589 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748591 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748594 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748597 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748599 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748602 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748604 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748607 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748609 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748612 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748615 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748617 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748620 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748623 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748625 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748628 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748630 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748633 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748635 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:13.751718 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748638 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748640 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748643 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748645 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748647 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748650 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748653 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748656 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748658 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748661 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748664 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748667 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748669 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748672 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748674 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748677 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748679 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748681 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748684 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748686 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:13.752229 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748689 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748691 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748695 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748699 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748703 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748706 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748710 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748713 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748716 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748718 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748721 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748724 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748727 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748729 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748731 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748734 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748737 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748739 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748742 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:13.752724 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748745 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748747 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748750 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748753 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748756 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748758 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748760 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748763 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748765 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748768 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748771 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748773 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:13.748776 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.748781 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.749566 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:59:13.753247 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.751470 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:59:13.753617 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.752368 2571 server.go:1019] "Starting client certificate rotation"
Apr 16 13:59:13.753617 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.752462 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:13.753617 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.752502 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:13.780670 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.780648 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:13.783783 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.783765 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:13.801111 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.801087 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:59:13.806373 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.806357 2571 log.go:25] "Validated CRI v1 image API"
Apr 16 13:59:13.807629 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.807610 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:59:13.808676 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.808657 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:13.812298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.812274 2571 fs.go:135] Filesystem UUIDs: map[466cc28a-c4d7-4f7a-a44e-7d217888e869:/dev/nvme0n1p4 5f49dde5-3519-4141-98b5-d66dd79daa28:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 13:59:13.812365 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.812298 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:59:13.819088 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.818962 2571 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:13.816724329 +0000 UTC m=+0.446092599 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100870 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2aa554bdeebb2fa6a04006be90120f SystemUUID:ec2aa554-bdee-bb2f-a6a0-4006be90120f BootID:4cd2ed07-0947-4294-a36c-8c35ccc61b8c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c8:f8:a2:b9:6b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c8:f8:a2:b9:6b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:50:54:44:f3:b8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:59:13.819088 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.819083 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:59:13.819203 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.819173 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:13.821211 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.821189 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:59:13.821348 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.821213 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-99.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:59:13.821400 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.821359 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:59:13.821400 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.821368 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:59:13.821400 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.821381 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:13.823883 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.823870 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:13.824942 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.824931 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:13.825081 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.825060 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 13:59:13.827326 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.827316 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 13:59:13.827368 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.827338 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 13:59:13.827368 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.827354 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 13:59:13.827368 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.827363 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 16 13:59:13.827530 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.827374 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 13:59:13.828507 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.828493 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:59:13.828553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.828520 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:59:13.831298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.831282 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 13:59:13.832912 ip-10-0-131-99
kubenswrapper[2571]: I0416 13:59:13.832898 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:59:13.834132 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834116 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:59:13.834197 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834139 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:59:13.834197 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834149 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:59:13.834197 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834158 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:59:13.834197 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834167 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:59:13.834197 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834175 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:59:13.834197 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834183 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:59:13.834197 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834192 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:59:13.834389 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834202 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:59:13.834389 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834210 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:59:13.834389 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834232 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:59:13.834389 
ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.834246 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:59:13.835137 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.835123 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:59:13.835194 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.835144 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:59:13.839319 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.839291 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:59:13.839428 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.839338 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-99.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:59:13.839428 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.839344 2571 server.go:1295] "Started kubelet" Apr 16 13:59:13.839546 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.839375 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:13.839546 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.839455 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-99.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:59:13.839546 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.839451 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Service" Apr 16 13:59:13.839546 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.839440 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:59:13.839546 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.839494 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:13.840172 ip-10-0-131-99 systemd[1]: Started Kubernetes Kubelet. Apr 16 13:59:13.840478 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.840461 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:59:13.840742 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.840639 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:59:13.845334 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.845316 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:13.845919 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.845897 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:59:13.846730 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.846711 2571 factory.go:55] Registering systemd factory Apr 16 13:59:13.846808 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.846779 2571 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:59:13.846859 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.845746 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-99.ec2.internal.18a6db0d72157c99 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-99.ec2.internal,UID:ip-10-0-131-99.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-99.ec2.internal,},FirstTimestamp:2026-04-16 13:59:13.839307929 +0000 UTC m=+0.468676199,LastTimestamp:2026-04-16 13:59:13.839307929 +0000 UTC m=+0.468676199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-99.ec2.internal,}" Apr 16 13:59:13.846939 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.846868 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found" Apr 16 13:59:13.846986 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.846972 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:59:13.847033 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.846989 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:59:13.847033 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.846993 2571 factory.go:153] Registering CRI-O factory Apr 16 13:59:13.847033 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.847001 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:59:13.847033 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.847006 2571 factory.go:223] Registration of the crio container factory successfully Apr 16 13:59:13.847223 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.847064 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:59:13.847223 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.847089 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:59:13.847223 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.847106 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:59:13.847223 ip-10-0-131-99 
kubenswrapper[2571]: I0416 13:59:13.847148 2571 factory.go:103] Registering Raw factory Apr 16 13:59:13.847223 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.847166 2571 manager.go:1196] Started watching for new ooms in manager Apr 16 13:59:13.847390 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.847352 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 13:59:13.849094 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.847879 2571 manager.go:319] Starting recovery of all containers Apr 16 13:59:13.856267 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.856230 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-df8k7" Apr 16 13:59:13.856954 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.856919 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 13:59:13.857087 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.857024 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-99.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 13:59:13.860731 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.860714 2571 manager.go:324] Recovery completed Apr 16 13:59:13.861280 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.861211 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-df8k7" Apr 16 
13:59:13.862302 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.862282 2571 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 13:59:13.865127 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.865114 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:13.867444 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.867430 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:13.867508 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.867457 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:13.867508 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.867467 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:13.867918 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.867901 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:59:13.867918 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.867915 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:59:13.868053 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.867930 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:13.870122 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.870001 2571 policy_none.go:49] "None policy: Start" Apr 16 13:59:13.870168 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.870129 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:59:13.870168 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.870140 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:59:13.913739 
ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.913720 2571 manager.go:341] "Starting Device Plugin manager" Apr 16 13:59:13.914770 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.913759 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:59:13.914770 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.913772 2571 server.go:85] "Starting device plugin registration server" Apr 16 13:59:13.914770 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.914031 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:59:13.914770 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.914046 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:59:13.914770 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.914170 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:59:13.914770 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.914268 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:59:13.914770 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.914280 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:59:13.915128 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.914844 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 13:59:13.915128 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.914884 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-99.ec2.internal\" not found" Apr 16 13:59:13.949083 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.949036 2571 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 16 13:59:13.950268 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.950249 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 13:59:13.950386 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.950277 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:59:13.950386 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.950296 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 13:59:13.950386 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.950303 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:59:13.950386 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:13.950339 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:59:13.952717 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:13.952700 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:14.014748 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.014644 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:14.015667 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.015638 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:14.015667 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.015669 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:14.015843 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.015680 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:14.015843 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.015704 2571 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.025670 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.025653 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.025789 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.025676 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-99.ec2.internal\": node \"ip-10-0-131-99.ec2.internal\" not found" Apr 16 13:59:14.043682 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.043654 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found" Apr 16 13:59:14.051040 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.051020 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal"] Apr 16 13:59:14.051135 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.051125 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:14.052150 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.052133 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:14.052229 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.052171 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:14.052229 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.052193 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:14.053531 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.053518 2571 kubelet_node_status.go:413] "Setting node annotation to 
enable volume controller attach/detach" Apr 16 13:59:14.053662 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.053648 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.053706 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.053686 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:14.054409 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.054394 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:14.054494 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.054416 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:14.054494 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.054417 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:14.054494 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.054426 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:14.054494 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.054441 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:14.054494 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.054455 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:14.058422 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.058405 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.058523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.058432 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:14.059514 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.059499 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:14.059580 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.059529 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:14.059580 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.059540 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:14.081966 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.081940 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-99.ec2.internal\" not found" node="ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.087148 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.087127 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-99.ec2.internal\" not found" node="ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.144162 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.144132 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found" Apr 16 13:59:14.148336 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.148317 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bc3ab7e9751845419db98ea68513295-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal\" (UID: \"1bc3ab7e9751845419db98ea68513295\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.148447 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.148344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bc3ab7e9751845419db98ea68513295-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal\" (UID: \"1bc3ab7e9751845419db98ea68513295\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.148447 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.148361 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f5b5a9d82f93d048857d4c98e90f0fd3-config\") pod \"kube-apiserver-proxy-ip-10-0-131-99.ec2.internal\" (UID: \"f5b5a9d82f93d048857d4c98e90f0fd3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.245039 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.244996 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found" Apr 16 13:59:14.249420 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.249398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bc3ab7e9751845419db98ea68513295-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal\" (UID: \"1bc3ab7e9751845419db98ea68513295\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.249502 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.249440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1bc3ab7e9751845419db98ea68513295-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal\" (UID: \"1bc3ab7e9751845419db98ea68513295\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.249502 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.249469 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f5b5a9d82f93d048857d4c98e90f0fd3-config\") pod \"kube-apiserver-proxy-ip-10-0-131-99.ec2.internal\" (UID: \"f5b5a9d82f93d048857d4c98e90f0fd3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.249502 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.249477 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bc3ab7e9751845419db98ea68513295-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal\" (UID: \"1bc3ab7e9751845419db98ea68513295\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.249619 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.249527 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f5b5a9d82f93d048857d4c98e90f0fd3-config\") pod \"kube-apiserver-proxy-ip-10-0-131-99.ec2.internal\" (UID: \"f5b5a9d82f93d048857d4c98e90f0fd3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal" Apr 16 13:59:14.249619 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.249550 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bc3ab7e9751845419db98ea68513295-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal\" (UID: \"1bc3ab7e9751845419db98ea68513295\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal"
Apr 16 13:59:14.345889 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.345815 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:14.387293 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.387271 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal"
Apr 16 13:59:14.389759 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.389739 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal"
Apr 16 13:59:14.446536 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.446499 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:14.546958 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.546928 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:14.647491 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.647460 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:14.730946 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.730916 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:14.747991 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.747966 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:14.752139 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.752125 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:59:14.752294 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.752269 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:14.752330 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.752304 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:14.845995 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.845969 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:14.848028 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.848009 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:14.861702 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.861646 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:14.862648 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.862621 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:13 +0000 UTC" deadline="2027-10-13 10:19:39.834960377 +0000 UTC"
Apr 16 13:59:14.862725 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.862648 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13076h20m24.972316372s"
Apr 16 13:59:14.893063 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:14.893027 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bc3ab7e9751845419db98ea68513295.slice/crio-01442820c243fe8d7d9f1ecaee5544d9de2420cde408d15a6cf4aec359119230 WatchSource:0}: Error finding container 01442820c243fe8d7d9f1ecaee5544d9de2420cde408d15a6cf4aec359119230: Status 404 returned error can't find the container with id 01442820c243fe8d7d9f1ecaee5544d9de2420cde408d15a6cf4aec359119230
Apr 16 13:59:14.893950 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.893934 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-678m9"
Apr 16 13:59:14.900655 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.900641 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:59:14.901798 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.901783 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-678m9"
Apr 16 13:59:14.948821 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:14.948766 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:14.953735 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:14.953696 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" event={"ID":"1bc3ab7e9751845419db98ea68513295","Type":"ContainerStarted","Data":"01442820c243fe8d7d9f1ecaee5544d9de2420cde408d15a6cf4aec359119230"}
Apr 16 13:59:15.049105 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.049060 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:15.064827 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:15.064790 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b5a9d82f93d048857d4c98e90f0fd3.slice/crio-db1977f44cfac90b2fd2eed1d1178514cb4da537a177a9de4e4928dd4de77222 WatchSource:0}: Error finding container db1977f44cfac90b2fd2eed1d1178514cb4da537a177a9de4e4928dd4de77222: Status 404 returned error can't find the container with id db1977f44cfac90b2fd2eed1d1178514cb4da537a177a9de4e4928dd4de77222
Apr 16 13:59:15.149931 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.149847 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:15.250394 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.250361 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-99.ec2.internal\" not found"
Apr 16 13:59:15.286425 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.286394 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:15.336129 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.336106 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:15.347249 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.347225 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal"
Apr 16 13:59:15.362617 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.362588 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:15.363584 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.363571 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal"
Apr 16 13:59:15.372822 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.372798 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:15.828754 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.828711 2571 apiserver.go:52] "Watching apiserver"
Apr 16 13:59:15.835586 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.835561 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:59:15.838542 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.838514 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-zvzcf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv","openshift-image-registry/node-ca-h95cv","openshift-multus/network-metrics-daemon-gg599","openshift-network-diagnostics/network-check-target-ptrgm","openshift-network-operator/iptables-alerter-m57gb","kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal","openshift-cluster-node-tuning-operator/tuned-p4hh7","openshift-dns/node-resolver-l47f6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal","openshift-multus/multus-additional-cni-plugins-52th9","openshift-multus/multus-hr2bh","openshift-ovn-kubernetes/ovnkube-node-4q9n5"]
Apr 16 13:59:15.839979 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.839959 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.841831 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.841805 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.842816 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.842790 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h95cv"
Apr 16 13:59:15.843403 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.843380 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 13:59:15.844119 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.844096 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 13:59:15.844203 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.844126 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 13:59:15.844444 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.844372 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wnt4r\""
Apr 16 13:59:15.844444 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.844415 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 13:59:15.844766 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.844615 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 13:59:15.844766 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.844636 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599"
Apr 16 13:59:15.844766 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.844662 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mj4m8\""
Apr 16 13:59:15.844766 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.844680 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 13:59:15.844766 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.844701 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b"
Apr 16 13:59:15.844766 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.844748 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 13:59:15.845105 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.844813 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c"
Apr 16 13:59:15.845442 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.845423 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 13:59:15.845537 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.845462 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 13:59:15.845786 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.845767 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xbllr\""
Apr 16 13:59:15.845911 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.845874 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 13:59:15.845911 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.845895 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m57gb"
Apr 16 13:59:15.846813 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.846789 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 13:59:15.848811 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.848266 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:59:15.848811 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.848536 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vqjl5\""
Apr 16 13:59:15.848811 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.848692 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 13:59:15.848811 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.848762 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 13:59:15.851235 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.851208 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-p4hh7"
Apr 16 13:59:15.852701 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.852682 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zvzcf"
Apr 16 13:59:15.853360 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.853339 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:59:15.853459 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.853436 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-s8lpb\""
Apr 16 13:59:15.853517 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.853456 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 13:59:15.854012 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.853994 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l47f6"
Apr 16 13:59:15.854942 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.854924 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9mjv8\""
Apr 16 13:59:15.855081 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.854944 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 13:59:15.855081 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.854926 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 13:59:15.855242 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.855223 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-52th9"
Apr 16 13:59:15.855334 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.855228 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5"
Apr 16 13:59:15.856166 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.856148 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 13:59:15.856270 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.856231 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 13:59:15.856374 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.856327 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-25hf8\""
Apr 16 13:59:15.857717 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857695 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 13:59:15.857811 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857757 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-device-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.857811 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857784 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-etc-selinux\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.857811 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444c36ba-0722-4b97-88e0-a10913a4f6b4-host\") pod \"node-ca-h95cv\" (UID: \"444c36ba-0722-4b97-88e0-a10913a4f6b4\") " pod="openshift-image-registry/node-ca-h95cv"
Apr 16 13:59:15.857972 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857813 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 13:59:15.857972 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857827 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vxcj\" (UniqueName: \"kubernetes.io/projected/444c36ba-0722-4b97-88e0-a10913a4f6b4-kube-api-access-6vxcj\") pod \"node-ca-h95cv\" (UID: \"444c36ba-0722-4b97-88e0-a10913a4f6b4\") " pod="openshift-image-registry/node-ca-h95cv"
Apr 16 13:59:15.857972 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-var-lib-cni-multus\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.857972 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857868 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b9041b28-b0ce-4c85-ab1a-80e6b2609764-host-slash\") pod \"iptables-alerter-m57gb\" (UID: \"b9041b28-b0ce-4c85-ab1a-80e6b2609764\") " pod="openshift-network-operator/iptables-alerter-m57gb"
Apr 16 13:59:15.857972 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-os-release\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.857972 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857909 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dc274749-ec6f-4398-a91a-e94036d6a048-multus-daemon-config\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.857972 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rksrx\" (UniqueName: \"kubernetes.io/projected/dc274749-ec6f-4398-a91a-e94036d6a048-kube-api-access-rksrx\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.857972 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857946 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 13:59:15.857972 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857964 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.857983 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-hostroot\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858003 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b9041b28-b0ce-4c85-ab1a-80e6b2609764-iptables-alerter-script\") pod \"iptables-alerter-m57gb\" (UID: \"b9041b28-b0ce-4c85-ab1a-80e6b2609764\") " pod="openshift-network-operator/iptables-alerter-m57gb"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858134 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-run-k8s-cni-cncf-io\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-socket-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl8gc\" (UniqueName: \"kubernetes.io/projected/bf77106c-73b0-4238-861a-09f2f637db71-kube-api-access-dl8gc\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858240 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-n5s5b\""
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858267 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-etc-kubernetes\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858294 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thk8s\" (UniqueName: \"kubernetes.io/projected/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-kube-api-access-thk8s\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dc274749-ec6f-4398-a91a-e94036d6a048-cni-binary-copy\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858337 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858367 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-sys-fs\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858401 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a32af74e-5db6-424e-85ec-f3c363b28eb5-konnectivity-ca\") pod \"konnectivity-agent-zvzcf\" (UID: \"a32af74e-5db6-424e-85ec-f3c363b28eb5\") " pod="kube-system/konnectivity-agent-zvzcf"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858430 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-multus-cni-dir\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858437 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858456 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-cnibin\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.858472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/444c36ba-0722-4b97-88e0-a10913a4f6b4-serviceca\") pod \"node-ca-h95cv\" (UID: \"444c36ba-0722-4b97-88e0-a10913a4f6b4\") " pod="openshift-image-registry/node-ca-h95cv"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858522 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7zr\" (UniqueName: \"kubernetes.io/projected/b9041b28-b0ce-4c85-ab1a-80e6b2609764-kube-api-access-zm7zr\") pod \"iptables-alerter-m57gb\" (UID: \"b9041b28-b0ce-4c85-ab1a-80e6b2609764\") " pod="openshift-network-operator/iptables-alerter-m57gb"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858548 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a32af74e-5db6-424e-85ec-f3c363b28eb5-agent-certs\") pod \"konnectivity-agent-zvzcf\" (UID: \"a32af74e-5db6-424e-85ec-f3c363b28eb5\") " pod="kube-system/konnectivity-agent-zvzcf"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858602 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-var-lib-cni-bin\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858639 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-var-lib-kubelet\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858661 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-system-cni-dir\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858695 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hgr6q\""
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858710 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-multus-socket-dir-parent\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858734 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-multus-conf-dir\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858757 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-run-multus-certs\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858781 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-run-netns\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858730 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858643 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858806 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.859303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.858839 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-registration-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.902492 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.902456 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:14 +0000 UTC" deadline="2028-01-09 15:37:33.805624162 +0000 UTC"
Apr 16 13:59:15.902610 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.902492 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15193h38m17.903136608s"
Apr 16 13:59:15.947939 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.947908 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 13:59:15.955783 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.955751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal" event={"ID":"f5b5a9d82f93d048857d4c98e90f0fd3","Type":"ContainerStarted","Data":"db1977f44cfac90b2fd2eed1d1178514cb4da537a177a9de4e4928dd4de77222"}
Apr 16 13:59:15.959163 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-etc-openvswitch\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5"
Apr 16 13:59:15.959309 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959168 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-run-ovn\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5"
Apr 16 13:59:15.959309 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959194 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-run-ovn-kubernetes\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5"
Apr 16 13:59:15.959309 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959217 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-tmp\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7"
Apr 16 13:59:15.959309 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959246 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-sys-fs\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.959309 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959272 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-cnibin\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.959523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959330 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pppk7\" (UniqueName: \"kubernetes.io/projected/bb45c2a8-3222-492e-a359-cd27a52d6faa-kube-api-access-pppk7\") pod \"node-resolver-l47f6\" (UID: \"bb45c2a8-3222-492e-a359-cd27a52d6faa\") " pod="openshift-dns/node-resolver-l47f6"
Apr 16 13:59:15.959523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-sys-fs\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv"
Apr 16 13:59:15.959523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959376 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a32af74e-5db6-424e-85ec-f3c363b28eb5-agent-certs\") pod \"konnectivity-agent-zvzcf\" (UID: \"a32af74e-5db6-424e-85ec-f3c363b28eb5\") " pod="kube-system/konnectivity-agent-zvzcf"
Apr 16 13:59:15.959523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959343 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-cnibin\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.959523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959405 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-var-lib-cni-bin\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.959523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959432 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-var-lib-kubelet\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16 13:59:15.959523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-var-lib-kubelet\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh"
Apr 16
13:59:15.959523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959486 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-run-systemd\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.959523 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-var-lib-cni-bin\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959604 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-cni-netd\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959654 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb45c2a8-3222-492e-a359-cd27a52d6faa-tmp-dir\") pod \"node-resolver-l47f6\" (UID: \"bb45c2a8-3222-492e-a359-cd27a52d6faa\") " pod="openshift-dns/node-resolver-l47f6" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-multus-socket-dir-parent\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " 
pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-run-multus-certs\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959738 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-systemd-units\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-log-socket\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959803 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-multus-socket-dir-parent\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-cni-bin\") pod \"ovnkube-node-4q9n5\" (UID: 
\"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959862 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-run-multus-certs\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.959921 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959912 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-cnibin\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959948 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444c36ba-0722-4b97-88e0-a10913a4f6b4-host\") pod \"node-ca-h95cv\" (UID: \"444c36ba-0722-4b97-88e0-a10913a4f6b4\") " pod="openshift-image-registry/node-ca-h95cv" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vxcj\" (UniqueName: \"kubernetes.io/projected/444c36ba-0722-4b97-88e0-a10913a4f6b4-kube-api-access-6vxcj\") pod \"node-ca-h95cv\" (UID: \"444c36ba-0722-4b97-88e0-a10913a4f6b4\") " pod="openshift-image-registry/node-ca-h95cv" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959997 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-run-openvswitch\") pod 
\"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444c36ba-0722-4b97-88e0-a10913a4f6b4-host\") pod \"node-ca-h95cv\" (UID: \"444c36ba-0722-4b97-88e0-a10913a4f6b4\") " pod="openshift-image-registry/node-ca-h95cv" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.959970 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960020 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlbxq\" (UniqueName: \"kubernetes.io/projected/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-kube-api-access-zlbxq\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960089 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bb45c2a8-3222-492e-a359-cd27a52d6faa-hosts-file\") pod \"node-resolver-l47f6\" (UID: \"bb45c2a8-3222-492e-a359-cd27a52d6faa\") " pod="openshift-dns/node-resolver-l47f6" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-os-release\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " 
pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960130 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-tuning-conf-dir\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960154 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b9041b28-b0ce-4c85-ab1a-80e6b2609764-host-slash\") pod \"iptables-alerter-m57gb\" (UID: \"b9041b28-b0ce-4c85-ab1a-80e6b2609764\") " pod="openshift-network-operator/iptables-alerter-m57gb" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-os-release\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960236 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-os-release\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960267 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rksrx\" (UniqueName: \"kubernetes.io/projected/dc274749-ec6f-4398-a91a-e94036d6a048-kube-api-access-rksrx\") pod \"multus-hr2bh\" (UID: 
\"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b9041b28-b0ce-4c85-ab1a-80e6b2609764-host-slash\") pod \"iptables-alerter-m57gb\" (UID: \"b9041b28-b0ce-4c85-ab1a-80e6b2609764\") " pod="openshift-network-operator/iptables-alerter-m57gb" Apr 16 13:59:15.960337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960303 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-run-netns\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-hostroot\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960473 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-sysctl-conf\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960525 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-hostroot\") pod \"multus-hr2bh\" (UID: 
\"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-sys\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/497c5497-4184-4e77-90af-4b9edc13fa89-cni-binary-copy\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/497c5497-4184-4e77-90af-4b9edc13fa89-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960653 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-sysctl-d\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960687 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-var-lib-kubelet\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960717 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-system-cni-dir\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxkkm\" (UniqueName: \"kubernetes.io/projected/497c5497-4184-4e77-90af-4b9edc13fa89-kube-api-access-dxkkm\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-socket-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl8gc\" (UniqueName: \"kubernetes.io/projected/bf77106c-73b0-4238-861a-09f2f637db71-kube-api-access-dl8gc\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960860 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-etc-kubernetes\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.960892 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/00f5f350-f965-4f31-9400-648a4573f987-ovn-node-metrics-cert\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-socket-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dc274749-ec6f-4398-a91a-e94036d6a048-cni-binary-copy\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a32af74e-5db6-424e-85ec-f3c363b28eb5-konnectivity-ca\") pod \"konnectivity-agent-zvzcf\" (UID: 
\"a32af74e-5db6-424e-85ec-f3c363b28eb5\") " pod="kube-system/konnectivity-agent-zvzcf" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.960962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-etc-kubernetes\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-multus-cni-dir\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-var-lib-openvswitch\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961141 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-modprobe-d\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961197 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-tuned\") pod \"tuned-p4hh7\" 
(UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-multus-cni-dir\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/444c36ba-0722-4b97-88e0-a10913a4f6b4-serviceca\") pod \"node-ca-h95cv\" (UID: \"444c36ba-0722-4b97-88e0-a10913a4f6b4\") " pod="openshift-image-registry/node-ca-h95cv" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7zr\" (UniqueName: \"kubernetes.io/projected/b9041b28-b0ce-4c85-ab1a-80e6b2609764-kube-api-access-zm7zr\") pod \"iptables-alerter-m57gb\" (UID: \"b9041b28-b0ce-4c85-ab1a-80e6b2609764\") " pod="openshift-network-operator/iptables-alerter-m57gb" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-node-log\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/00f5f350-f965-4f31-9400-648a4573f987-ovnkube-config\") pod 
\"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961345 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/00f5f350-f965-4f31-9400-648a4573f987-ovnkube-script-lib\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961368 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-systemd\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-system-cni-dir\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.961538 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-multus-conf-dir\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-run-netns\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-registration-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-device-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961525 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-etc-selinux\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961540 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dc274749-ec6f-4398-a91a-e94036d6a048-multus-daemon-config\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961549 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dc274749-ec6f-4398-a91a-e94036d6a048-cni-binary-copy\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961568 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a32af74e-5db6-424e-85ec-f3c363b28eb5-konnectivity-ca\") pod \"konnectivity-agent-zvzcf\" (UID: \"a32af74e-5db6-424e-85ec-f3c363b28eb5\") " pod="kube-system/konnectivity-agent-zvzcf" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961557 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l8vp\" (UniqueName: \"kubernetes.io/projected/00f5f350-f965-4f31-9400-648a4573f987-kube-api-access-7l8vp\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-multus-conf-dir\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961617 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-run-netns\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961627 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-var-lib-cni-multus\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961676 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-registration-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:15.962298 
ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-sysconfig\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.962298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961746 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-kubernetes\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-lib-modules\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961774 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-host\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961784 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-etc-selinux\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961788 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/497c5497-4184-4e77-90af-4b9edc13fa89-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961830 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-run\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961724 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-device-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/bf77106c-73b0-4238-861a-09f2f637db71-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961936 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-var-lib-cni-multus\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.961967 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/444c36ba-0722-4b97-88e0-a10913a4f6b4-serviceca\") pod \"node-ca-h95cv\" (UID: \"444c36ba-0722-4b97-88e0-a10913a4f6b4\") " pod="openshift-image-registry/node-ca-h95cv" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.962110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b9041b28-b0ce-4c85-ab1a-80e6b2609764-iptables-alerter-script\") pod \"iptables-alerter-m57gb\" (UID: \"b9041b28-b0ce-4c85-ab1a-80e6b2609764\") " pod="openshift-network-operator/iptables-alerter-m57gb" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.962145 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.962192 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-kubelet\") pod \"ovnkube-node-4q9n5\" (UID: 
\"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.962218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-slash\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.962249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dc274749-ec6f-4398-a91a-e94036d6a048-multus-daemon-config\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.962251 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-system-cni-dir\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.963036 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.962283 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs podName:97f73dc3-4dcf-4643-8dc6-cd6e6418679b nodeName:}" failed. No retries permitted until 2026-04-16 13:59:16.462267973 +0000 UTC m=+3.091636230 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs") pod "network-metrics-daemon-gg599" (UID: "97f73dc3-4dcf-4643-8dc6-cd6e6418679b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:15.963532 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.962324 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/00f5f350-f965-4f31-9400-648a4573f987-env-overrides\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:15.963532 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.962354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-run-k8s-cni-cncf-io\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.963532 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.962415 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thk8s\" (UniqueName: \"kubernetes.io/projected/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-kube-api-access-thk8s\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:15.963532 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.962429 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dc274749-ec6f-4398-a91a-e94036d6a048-host-run-k8s-cni-cncf-io\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.963532 ip-10-0-131-99 kubenswrapper[2571]: 
I0416 13:59:15.962731 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b9041b28-b0ce-4c85-ab1a-80e6b2609764-iptables-alerter-script\") pod \"iptables-alerter-m57gb\" (UID: \"b9041b28-b0ce-4c85-ab1a-80e6b2609764\") " pod="openshift-network-operator/iptables-alerter-m57gb" Apr 16 13:59:15.964059 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.964037 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a32af74e-5db6-424e-85ec-f3c363b28eb5-agent-certs\") pod \"konnectivity-agent-zvzcf\" (UID: \"a32af74e-5db6-424e-85ec-f3c363b28eb5\") " pod="kube-system/konnectivity-agent-zvzcf" Apr 16 13:59:15.969043 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.969016 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl8gc\" (UniqueName: \"kubernetes.io/projected/bf77106c-73b0-4238-861a-09f2f637db71-kube-api-access-dl8gc\") pod \"aws-ebs-csi-driver-node-6dlmv\" (UID: \"bf77106c-73b0-4238-861a-09f2f637db71\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:15.969161 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.969120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rksrx\" (UniqueName: \"kubernetes.io/projected/dc274749-ec6f-4398-a91a-e94036d6a048-kube-api-access-rksrx\") pod \"multus-hr2bh\" (UID: \"dc274749-ec6f-4398-a91a-e94036d6a048\") " pod="openshift-multus/multus-hr2bh" Apr 16 13:59:15.969456 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.969440 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vxcj\" (UniqueName: \"kubernetes.io/projected/444c36ba-0722-4b97-88e0-a10913a4f6b4-kube-api-access-6vxcj\") pod \"node-ca-h95cv\" (UID: \"444c36ba-0722-4b97-88e0-a10913a4f6b4\") " pod="openshift-image-registry/node-ca-h95cv" Apr 16 13:59:15.975004 
ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.974984 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:15.975004 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.975005 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:15.975220 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.975015 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vsdpt for pod openshift-network-diagnostics/network-check-target-ptrgm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:15.975220 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:15.975089 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt podName:a45bf770-bb2a-4a8f-8fa8-60cb36789e8c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:16.475059674 +0000 UTC m=+3.104427935 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vsdpt" (UniqueName: "kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt") pod "network-check-target-ptrgm" (UID: "a45bf770-bb2a-4a8f-8fa8-60cb36789e8c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:15.977638 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.977612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7zr\" (UniqueName: \"kubernetes.io/projected/b9041b28-b0ce-4c85-ab1a-80e6b2609764-kube-api-access-zm7zr\") pod \"iptables-alerter-m57gb\" (UID: \"b9041b28-b0ce-4c85-ab1a-80e6b2609764\") " pod="openshift-network-operator/iptables-alerter-m57gb" Apr 16 13:59:15.978770 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.978751 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thk8s\" (UniqueName: \"kubernetes.io/projected/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-kube-api-access-thk8s\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:15.990706 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:15.990677 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:16.063498 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063449 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l8vp\" (UniqueName: \"kubernetes.io/projected/00f5f350-f965-4f31-9400-648a4573f987-kube-api-access-7l8vp\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063676 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063504 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063676 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063552 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-sysconfig\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.063676 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063577 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-kubernetes\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.063676 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-lib-modules\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.063676 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063623 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-host\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.063676 ip-10-0-131-99 kubenswrapper[2571]: I0416 
13:59:16.063643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/497c5497-4184-4e77-90af-4b9edc13fa89-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063686 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-run\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-kubelet\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-slash\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/00f5f350-f965-4f31-9400-648a4573f987-env-overrides\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063953 ip-10-0-131-99 
kubenswrapper[2571]: I0416 13:59:16.063763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-etc-openvswitch\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063779 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-run-ovn\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-run-ovn-kubernetes\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063810 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-tmp\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063828 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pppk7\" (UniqueName: \"kubernetes.io/projected/bb45c2a8-3222-492e-a359-cd27a52d6faa-kube-api-access-pppk7\") pod \"node-resolver-l47f6\" (UID: \"bb45c2a8-3222-492e-a359-cd27a52d6faa\") " pod="openshift-dns/node-resolver-l47f6" Apr 16 13:59:16.063953 ip-10-0-131-99 
kubenswrapper[2571]: I0416 13:59:16.063857 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-run-systemd\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-cni-netd\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb45c2a8-3222-492e-a359-cd27a52d6faa-tmp-dir\") pod \"node-resolver-l47f6\" (UID: \"bb45c2a8-3222-492e-a359-cd27a52d6faa\") " pod="openshift-dns/node-resolver-l47f6" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063917 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-systemd-units\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.063953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.063938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-log-socket\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 
13:59:16.064005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-cni-bin\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-cnibin\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064045 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-run-openvswitch\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlbxq\" (UniqueName: \"kubernetes.io/projected/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-kube-api-access-zlbxq\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064114 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bb45c2a8-3222-492e-a359-cd27a52d6faa-hosts-file\") pod \"node-resolver-l47f6\" (UID: \"bb45c2a8-3222-492e-a359-cd27a52d6faa\") " pod="openshift-dns/node-resolver-l47f6" Apr 16 13:59:16.064574 ip-10-0-131-99 
kubenswrapper[2571]: I0416 13:59:16.064136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-os-release\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-tuning-conf-dir\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-run-ovn-kubernetes\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064286 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-run-systemd\") pod \"ovnkube-node-4q9n5\" (UID: 
\"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-run-netns\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-kubernetes\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064347 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-cni-bin\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-systemd-units\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-cnibin\") pod \"multus-additional-cni-plugins-52th9\" (UID: 
\"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064223 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-run-netns\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.064574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064429 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-run-ovn\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064443 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-sysconfig\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-host\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-kubelet\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bb45c2a8-3222-492e-a359-cd27a52d6faa-hosts-file\") pod \"node-resolver-l47f6\" (UID: \"bb45c2a8-3222-492e-a359-cd27a52d6faa\") " pod="openshift-dns/node-resolver-l47f6" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064501 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-sysctl-conf\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064520 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-run-openvswitch\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-sys\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/497c5497-4184-4e77-90af-4b9edc13fa89-cni-binary-copy\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " 
pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-os-release\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.065204 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.064587 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/497c5497-4184-4e77-90af-4b9edc13fa89-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.065608 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.065529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-tuning-conf-dir\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.065652 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.065581 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb45c2a8-3222-492e-a359-cd27a52d6faa-tmp-dir\") pod \"node-resolver-l47f6\" (UID: \"bb45c2a8-3222-492e-a359-cd27a52d6faa\") " pod="openshift-dns/node-resolver-l47f6" Apr 16 13:59:16.065829 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.065808 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-lib-modules\") pod 
\"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.065878 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.065859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/497c5497-4184-4e77-90af-4b9edc13fa89-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.065878 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.065869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-run\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.065965 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.065860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/00f5f350-f965-4f31-9400-648a4573f987-env-overrides\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.065965 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.065921 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-etc-openvswitch\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.065965 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.065959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-cni-netd\") pod 
\"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.066110 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.065995 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-sysctl-d\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.066110 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-log-socket\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.066209 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066145 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/497c5497-4184-4e77-90af-4b9edc13fa89-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.066407 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066374 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-var-lib-kubelet\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.066470 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-host-slash\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.066470 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066440 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-sysctl-conf\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.066561 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-system-cni-dir\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.066561 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066533 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-sysctl-d\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.066561 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066535 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-var-lib-kubelet\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.066707 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066592 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/497c5497-4184-4e77-90af-4b9edc13fa89-system-cni-dir\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.066707 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxkkm\" (UniqueName: \"kubernetes.io/projected/497c5497-4184-4e77-90af-4b9edc13fa89-kube-api-access-dxkkm\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.066707 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-sys\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.066707 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066665 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/00f5f350-f965-4f31-9400-648a4573f987-ovn-node-metrics-cert\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.066950 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-var-lib-openvswitch\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.066950 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066766 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-modprobe-d\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.066950 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-tuned\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.066950 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-node-log\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.066950 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/497c5497-4184-4e77-90af-4b9edc13fa89-cni-binary-copy\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.067188 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.066987 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-var-lib-openvswitch\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.067188 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.067027 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/00f5f350-f965-4f31-9400-648a4573f987-ovnkube-config\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.067188 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.067041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-modprobe-d\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.067188 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.067056 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/00f5f350-f965-4f31-9400-648a4573f987-node-log\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.067188 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.067108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/00f5f350-f965-4f31-9400-648a4573f987-ovnkube-script-lib\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.067188 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.067156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-systemd\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.067438 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.067319 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-systemd\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.067800 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.067778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/00f5f350-f965-4f31-9400-648a4573f987-ovnkube-script-lib\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.067896 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.067884 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/00f5f350-f965-4f31-9400-648a4573f987-ovnkube-config\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.069337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.068234 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-tmp\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.070054 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.070031 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/00f5f350-f965-4f31-9400-648a4573f987-ovn-node-metrics-cert\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.071109 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.071088 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-etc-tuned\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.073112 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.073089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l8vp\" (UniqueName: \"kubernetes.io/projected/00f5f350-f965-4f31-9400-648a4573f987-kube-api-access-7l8vp\") pod \"ovnkube-node-4q9n5\" (UID: \"00f5f350-f965-4f31-9400-648a4573f987\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.075637 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.075608 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxkkm\" (UniqueName: \"kubernetes.io/projected/497c5497-4184-4e77-90af-4b9edc13fa89-kube-api-access-dxkkm\") pod \"multus-additional-cni-plugins-52th9\" (UID: \"497c5497-4184-4e77-90af-4b9edc13fa89\") " pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.075808 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.075784 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pppk7\" (UniqueName: \"kubernetes.io/projected/bb45c2a8-3222-492e-a359-cd27a52d6faa-kube-api-access-pppk7\") pod \"node-resolver-l47f6\" (UID: \"bb45c2a8-3222-492e-a359-cd27a52d6faa\") " pod="openshift-dns/node-resolver-l47f6" Apr 16 13:59:16.075955 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.075931 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlbxq\" (UniqueName: \"kubernetes.io/projected/77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e-kube-api-access-zlbxq\") pod \"tuned-p4hh7\" (UID: \"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e\") " pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.152929 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.152895 
2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hr2bh" Apr 16 13:59:16.159963 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:16.159931 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc274749_ec6f_4398_a91a_e94036d6a048.slice/crio-0f4839a2ef886283439f4771bc1e331d9281061a87d42bfb7eee4b20ed7ea987 WatchSource:0}: Error finding container 0f4839a2ef886283439f4771bc1e331d9281061a87d42bfb7eee4b20ed7ea987: Status 404 returned error can't find the container with id 0f4839a2ef886283439f4771bc1e331d9281061a87d42bfb7eee4b20ed7ea987 Apr 16 13:59:16.163534 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.163511 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" Apr 16 13:59:16.169517 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.169492 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zvzcf" Apr 16 13:59:16.171615 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:16.171572 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf77106c_73b0_4238_861a_09f2f637db71.slice/crio-95e119843cdd1dfc674a78f7347531e362ceb636393cdde1331564112561f25b WatchSource:0}: Error finding container 95e119843cdd1dfc674a78f7347531e362ceb636393cdde1331564112561f25b: Status 404 returned error can't find the container with id 95e119843cdd1dfc674a78f7347531e362ceb636393cdde1331564112561f25b Apr 16 13:59:16.174723 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.174698 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-h95cv" Apr 16 13:59:16.177608 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:16.177581 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda32af74e_5db6_424e_85ec_f3c363b28eb5.slice/crio-c078772959b173e8926649de3a60ad4ec1553cc7fd35f3f3fcf2bb4a35c034d0 WatchSource:0}: Error finding container c078772959b173e8926649de3a60ad4ec1553cc7fd35f3f3fcf2bb4a35c034d0: Status 404 returned error can't find the container with id c078772959b173e8926649de3a60ad4ec1553cc7fd35f3f3fcf2bb4a35c034d0 Apr 16 13:59:16.181024 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.180994 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m57gb" Apr 16 13:59:16.183181 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:16.183157 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod444c36ba_0722_4b97_88e0_a10913a4f6b4.slice/crio-ab4fe049dc17be6302fe2651ee62ff8511d3a093a871e5c0400e88b6077ccf1d WatchSource:0}: Error finding container ab4fe049dc17be6302fe2651ee62ff8511d3a093a871e5c0400e88b6077ccf1d: Status 404 returned error can't find the container with id ab4fe049dc17be6302fe2651ee62ff8511d3a093a871e5c0400e88b6077ccf1d Apr 16 13:59:16.187653 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.187631 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" Apr 16 13:59:16.188529 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:16.188509 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9041b28_b0ce_4c85_ab1a_80e6b2609764.slice/crio-47dc564fa3429a4f5d9ed95d4c4b4c9fea7626d785ff3444a0290ed40f7fd022 WatchSource:0}: Error finding container 47dc564fa3429a4f5d9ed95d4c4b4c9fea7626d785ff3444a0290ed40f7fd022: Status 404 returned error can't find the container with id 47dc564fa3429a4f5d9ed95d4c4b4c9fea7626d785ff3444a0290ed40f7fd022 Apr 16 13:59:16.193435 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.193272 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l47f6" Apr 16 13:59:16.197766 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:16.197744 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77f6f93e_5bf6_4049_9cfd_a80a1c8b5a2e.slice/crio-d41931c32b8b5d53b97a871a508cb2d702a07127e9a82b9f0d4b27554d62e326 WatchSource:0}: Error finding container d41931c32b8b5d53b97a871a508cb2d702a07127e9a82b9f0d4b27554d62e326: Status 404 returned error can't find the container with id d41931c32b8b5d53b97a871a508cb2d702a07127e9a82b9f0d4b27554d62e326 Apr 16 13:59:16.199400 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.199378 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-52th9" Apr 16 13:59:16.204137 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.204118 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:16.215998 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:16.215637 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod497c5497_4184_4e77_90af_4b9edc13fa89.slice/crio-3e5e5653bb41cce35c39641315b695d6106fe1e50a30dd9854b3fd59bf4bc636 WatchSource:0}: Error finding container 3e5e5653bb41cce35c39641315b695d6106fe1e50a30dd9854b3fd59bf4bc636: Status 404 returned error can't find the container with id 3e5e5653bb41cce35c39641315b695d6106fe1e50a30dd9854b3fd59bf4bc636 Apr 16 13:59:16.218119 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:16.217809 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00f5f350_f965_4f31_9400_648a4573f987.slice/crio-e6f368a75d93d15fdf90f63764e557b03aa6a34a32e00ccf0f3ad7352a4aa2c2 WatchSource:0}: Error finding container e6f368a75d93d15fdf90f63764e557b03aa6a34a32e00ccf0f3ad7352a4aa2c2: Status 404 returned error can't find the container with id e6f368a75d93d15fdf90f63764e557b03aa6a34a32e00ccf0f3ad7352a4aa2c2 Apr 16 13:59:16.469692 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.469599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:16.469817 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:16.469742 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:16.469817 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:16.469811 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs podName:97f73dc3-4dcf-4643-8dc6-cd6e6418679b nodeName:}" failed. No retries permitted until 2026-04-16 13:59:17.46979265 +0000 UTC m=+4.099160912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs") pod "network-metrics-daemon-gg599" (UID: "97f73dc3-4dcf-4643-8dc6-cd6e6418679b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:16.570925 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.570887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:16.571118 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:16.571042 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:16.571118 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:16.571065 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:16.571118 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:16.571101 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vsdpt for pod openshift-network-diagnostics/network-check-target-ptrgm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:16.571290 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:16.571163 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt podName:a45bf770-bb2a-4a8f-8fa8-60cb36789e8c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:17.571146171 +0000 UTC m=+4.200514427 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vsdpt" (UniqueName: "kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt") pod "network-check-target-ptrgm" (UID: "a45bf770-bb2a-4a8f-8fa8-60cb36789e8c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:16.810132 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.809829 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:16.903082 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.903022 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:14 +0000 UTC" deadline="2028-01-29 15:30:02.522041008 +0000 UTC" Apr 16 13:59:16.903082 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.903060 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15673h30m45.618984655s" Apr 16 13:59:16.951614 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.951560 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:16.951799 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:16.951702 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:16.960289 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.960255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m57gb" event={"ID":"b9041b28-b0ce-4c85-ab1a-80e6b2609764","Type":"ContainerStarted","Data":"47dc564fa3429a4f5d9ed95d4c4b4c9fea7626d785ff3444a0290ed40f7fd022"} Apr 16 13:59:16.961661 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.961629 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zvzcf" event={"ID":"a32af74e-5db6-424e-85ec-f3c363b28eb5","Type":"ContainerStarted","Data":"c078772959b173e8926649de3a60ad4ec1553cc7fd35f3f3fcf2bb4a35c034d0"} Apr 16 13:59:16.962970 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.962868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hr2bh" event={"ID":"dc274749-ec6f-4398-a91a-e94036d6a048","Type":"ContainerStarted","Data":"0f4839a2ef886283439f4771bc1e331d9281061a87d42bfb7eee4b20ed7ea987"} Apr 16 13:59:16.964872 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.964845 2571 generic.go:358] "Generic (PLEG): container finished" podID="1bc3ab7e9751845419db98ea68513295" containerID="34c927e0ea61bf340c83565087cb1db6123d11a38fd00b228abda040755007b0" exitCode=0 Apr 16 13:59:16.964974 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.964919 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" event={"ID":"1bc3ab7e9751845419db98ea68513295","Type":"ContainerDied","Data":"34c927e0ea61bf340c83565087cb1db6123d11a38fd00b228abda040755007b0"} Apr 16 13:59:16.966520 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.966495 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l47f6" 
event={"ID":"bb45c2a8-3222-492e-a359-cd27a52d6faa","Type":"ContainerStarted","Data":"80fd541f23e0b61d94eaf84fee77bdefc55cad95498bc27912beeb37b18e45d1"} Apr 16 13:59:16.967667 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.967642 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h95cv" event={"ID":"444c36ba-0722-4b97-88e0-a10913a4f6b4","Type":"ContainerStarted","Data":"ab4fe049dc17be6302fe2651ee62ff8511d3a093a871e5c0400e88b6077ccf1d"} Apr 16 13:59:16.969803 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.969763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" event={"ID":"bf77106c-73b0-4238-861a-09f2f637db71","Type":"ContainerStarted","Data":"95e119843cdd1dfc674a78f7347531e362ceb636393cdde1331564112561f25b"} Apr 16 13:59:16.971203 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.971152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerStarted","Data":"e6f368a75d93d15fdf90f63764e557b03aa6a34a32e00ccf0f3ad7352a4aa2c2"} Apr 16 13:59:16.972754 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.972730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52th9" event={"ID":"497c5497-4184-4e77-90af-4b9edc13fa89","Type":"ContainerStarted","Data":"3e5e5653bb41cce35c39641315b695d6106fe1e50a30dd9854b3fd59bf4bc636"} Apr 16 13:59:16.974758 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:16.974733 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" event={"ID":"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e","Type":"ContainerStarted","Data":"d41931c32b8b5d53b97a871a508cb2d702a07127e9a82b9f0d4b27554d62e326"} Apr 16 13:59:17.476286 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:17.476251 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:17.476480 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:17.476420 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:17.476544 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:17.476485 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs podName:97f73dc3-4dcf-4643-8dc6-cd6e6418679b nodeName:}" failed. No retries permitted until 2026-04-16 13:59:19.476465016 +0000 UTC m=+6.105833278 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs") pod "network-metrics-daemon-gg599" (UID: "97f73dc3-4dcf-4643-8dc6-cd6e6418679b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:17.577153 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:17.577105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:17.577339 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:17.577320 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:17.577411 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:17.577344 2571 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:17.577411 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:17.577358 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vsdpt for pod openshift-network-diagnostics/network-check-target-ptrgm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:17.577505 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:17.577413 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt podName:a45bf770-bb2a-4a8f-8fa8-60cb36789e8c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:19.577395079 +0000 UTC m=+6.206763342 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vsdpt" (UniqueName: "kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt") pod "network-check-target-ptrgm" (UID: "a45bf770-bb2a-4a8f-8fa8-60cb36789e8c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:17.951119 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:17.950610 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:17.951119 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:17.950733 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:18.951789 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:18.951249 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:18.951789 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:18.951389 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:19.000066 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.000027 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zvzcf" event={"ID":"a32af74e-5db6-424e-85ec-f3c363b28eb5","Type":"ContainerStarted","Data":"f5011d0189def6b82ead13bfda21cf3dfb97651dea92e553e704f972ed51f2e1"} Apr 16 13:59:19.007717 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.007686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" event={"ID":"1bc3ab7e9751845419db98ea68513295","Type":"ContainerStarted","Data":"cbb1dcb60a89563f94ce1c84faa0ccca5cf42a3703aec36552446d530bc4858f"} Apr 16 13:59:19.014308 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.014245 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zvzcf" podStartSLOduration=3.308628552 podStartE2EDuration="5.014229328s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 13:59:16.179628887 +0000 UTC m=+2.808997148" lastFinishedPulling="2026-04-16 13:59:17.885229653 +0000 UTC m=+4.514597924" 
observedRunningTime="2026-04-16 13:59:19.013999402 +0000 UTC m=+5.643367682" watchObservedRunningTime="2026-04-16 13:59:19.014229328 +0000 UTC m=+5.643597607" Apr 16 13:59:19.017957 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.017928 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal" event={"ID":"f5b5a9d82f93d048857d4c98e90f0fd3","Type":"ContainerStarted","Data":"9c77c28f035341abd27f9ce5cd0f8135561b449fdc2406721f5fc242401555ce"} Apr 16 13:59:19.027282 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.026544 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-99.ec2.internal" podStartSLOduration=4.026528969 podStartE2EDuration="4.026528969s" podCreationTimestamp="2026-04-16 13:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:19.026264182 +0000 UTC m=+5.655632461" watchObservedRunningTime="2026-04-16 13:59:19.026528969 +0000 UTC m=+5.655897248" Apr 16 13:59:19.336010 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.335638 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zvzcf" Apr 16 13:59:19.336862 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.336666 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zvzcf" Apr 16 13:59:19.351036 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.349817 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-99.ec2.internal" podStartSLOduration=4.349784123 podStartE2EDuration="4.349784123s" podCreationTimestamp="2026-04-16 13:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 13:59:19.038713692 +0000 UTC m=+5.668081974" watchObservedRunningTime="2026-04-16 13:59:19.349784123 +0000 UTC m=+5.979152415" Apr 16 13:59:19.494268 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.494232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:19.494457 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:19.494403 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:19.494514 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:19.494466 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs podName:97f73dc3-4dcf-4643-8dc6-cd6e6418679b nodeName:}" failed. No retries permitted until 2026-04-16 13:59:23.494446659 +0000 UTC m=+10.123814933 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs") pod "network-metrics-daemon-gg599" (UID: "97f73dc3-4dcf-4643-8dc6-cd6e6418679b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:19.594686 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.594605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:19.594845 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:19.594781 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:19.594845 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:19.594801 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:19.594845 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:19.594813 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vsdpt for pod openshift-network-diagnostics/network-check-target-ptrgm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:19.594998 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:19.594873 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt podName:a45bf770-bb2a-4a8f-8fa8-60cb36789e8c nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:23.59485318 +0000 UTC m=+10.224221461 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vsdpt" (UniqueName: "kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt") pod "network-check-target-ptrgm" (UID: "a45bf770-bb2a-4a8f-8fa8-60cb36789e8c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:19.951623 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:19.951542 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:19.951788 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:19.951672 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:20.950792 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:20.950751 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:20.951234 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:20.950909 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:21.021684 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:21.021655 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:21.951219 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:21.951183 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:21.951639 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:21.951366 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:22.951554 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:22.951175 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:22.951554 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:22.951326 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:23.529498 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:23.529416 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:23.529676 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:23.529593 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:23.529757 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:23.529678 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs podName:97f73dc3-4dcf-4643-8dc6-cd6e6418679b nodeName:}" failed. No retries permitted until 2026-04-16 13:59:31.529655709 +0000 UTC m=+18.159023980 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs") pod "network-metrics-daemon-gg599" (UID: "97f73dc3-4dcf-4643-8dc6-cd6e6418679b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:23.630455 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:23.629780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:23.630455 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:23.629969 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:23.630455 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:23.629989 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:23.630455 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:23.630002 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vsdpt for pod openshift-network-diagnostics/network-check-target-ptrgm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:23.630455 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:23.630060 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt podName:a45bf770-bb2a-4a8f-8fa8-60cb36789e8c nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:31.630041241 +0000 UTC m=+18.259409514 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vsdpt" (UniqueName: "kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt") pod "network-check-target-ptrgm" (UID: "a45bf770-bb2a-4a8f-8fa8-60cb36789e8c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:23.912090 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:23.911979 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zvzcf" Apr 16 13:59:23.912237 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:23.912147 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:23.913122 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:23.913098 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zvzcf" Apr 16 13:59:23.951555 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:23.951526 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:23.951728 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:23.951634 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:24.004633 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.004196 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bcp9m"] Apr 16 13:59:24.010152 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.010128 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.010301 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:24.010212 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0" Apr 16 13:59:24.134215 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.134149 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-dbus\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.134393 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.134325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-kubelet-config\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.134393 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.134353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.236111 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.235478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-kubelet-config\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.236111 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.235523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.236111 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.235567 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-dbus\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.236111 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.235611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-kubelet-config\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.236111 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:24.235695 2571 
secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:24.236111 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.235746 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-dbus\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.236111 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:24.235761 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret podName:83635cb4-c000-4ff0-8ff5-171c0a1c00d0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.735742054 +0000 UTC m=+11.365110331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret") pod "global-pull-secret-syncer-bcp9m" (UID: "83635cb4-c000-4ff0-8ff5-171c0a1c00d0") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:24.739983 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.739950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:24.740172 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:24.740144 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:24.740237 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:24.740230 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret podName:83635cb4-c000-4ff0-8ff5-171c0a1c00d0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:25.740208936 +0000 UTC m=+12.369577209 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret") pod "global-pull-secret-syncer-bcp9m" (UID: "83635cb4-c000-4ff0-8ff5-171c0a1c00d0") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:24.951246 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:24.951209 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:24.951403 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:24.951358 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:25.747471 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:25.747431 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:25.747875 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:25.747578 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:25.747875 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:25.747639 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret podName:83635cb4-c000-4ff0-8ff5-171c0a1c00d0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:27.747622481 +0000 UTC m=+14.376990741 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret") pod "global-pull-secret-syncer-bcp9m" (UID: "83635cb4-c000-4ff0-8ff5-171c0a1c00d0") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:25.951376 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:25.951341 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:25.951529 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:25.951472 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:25.951894 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:25.951879 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:25.951997 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:25.951977 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0" Apr 16 13:59:26.951198 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:26.951159 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:26.951574 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:26.951316 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:27.764061 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:27.764025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:27.764269 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:27.764174 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:27.764269 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:27.764249 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret podName:83635cb4-c000-4ff0-8ff5-171c0a1c00d0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:31.764227444 +0000 UTC m=+18.393595712 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret") pod "global-pull-secret-syncer-bcp9m" (UID: "83635cb4-c000-4ff0-8ff5-171c0a1c00d0") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:27.950669 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:27.950633 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:27.950841 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:27.950646 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:27.950841 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:27.950773 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:27.950964 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:27.950872 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0" Apr 16 13:59:28.951247 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:28.951213 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:28.951697 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:28.951354 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:29.951286 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:29.951256 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:29.951738 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:29.951411 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:29.951738 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:29.951505 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:29.951738 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:29.951591 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0" Apr 16 13:59:30.950968 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:30.950935 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:30.951161 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:30.951063 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:31.591394 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:31.591350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:31.591863 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.591532 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:31.591863 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.591603 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs podName:97f73dc3-4dcf-4643-8dc6-cd6e6418679b nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.591585034 +0000 UTC m=+34.220953306 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs") pod "network-metrics-daemon-gg599" (UID: "97f73dc3-4dcf-4643-8dc6-cd6e6418679b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:31.691916 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:31.691877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:31.692116 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.692052 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:31.692116 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.692093 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:31.692116 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.692106 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vsdpt for pod openshift-network-diagnostics/network-check-target-ptrgm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:31.692279 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.692165 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt podName:a45bf770-bb2a-4a8f-8fa8-60cb36789e8c nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:47.692144499 +0000 UTC m=+34.321512758 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vsdpt" (UniqueName: "kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt") pod "network-check-target-ptrgm" (UID: "a45bf770-bb2a-4a8f-8fa8-60cb36789e8c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:31.792714 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:31.792683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:31.792886 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.792797 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:31.792886 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.792855 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret podName:83635cb4-c000-4ff0-8ff5-171c0a1c00d0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.792837102 +0000 UTC m=+26.422205361 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret") pod "global-pull-secret-syncer-bcp9m" (UID: "83635cb4-c000-4ff0-8ff5-171c0a1c00d0") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:31.950689 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:31.950619 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:31.950689 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:31.950653 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:31.950925 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.950750 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:31.950925 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:31.950876 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0" Apr 16 13:59:32.951376 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:32.951342 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:32.951736 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:32.951453 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:33.952303 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:33.951895 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:33.953148 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:33.951972 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:33.953148 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:33.952405 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:33.953148 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:33.952476 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0" Apr 16 13:59:34.051453 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.051051 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hr2bh" event={"ID":"dc274749-ec6f-4398-a91a-e94036d6a048","Type":"ContainerStarted","Data":"a91fe122bb246e736d91387f40903825d24a3605969cea7994b267744f2b3211"} Apr 16 13:59:34.052842 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.052817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l47f6" event={"ID":"bb45c2a8-3222-492e-a359-cd27a52d6faa","Type":"ContainerStarted","Data":"dc3e92c9e25095dc7cdad0d8f4eec252848371abcc9797ff6720d9cbcc3ecc0d"} Apr 16 13:59:34.054255 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.054216 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h95cv" event={"ID":"444c36ba-0722-4b97-88e0-a10913a4f6b4","Type":"ContainerStarted","Data":"8b2ad9e8bb93728a8550c0c4741a73353b0503f28200ea3de575d0291362b488"} Apr 16 13:59:34.055826 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.055801 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" event={"ID":"bf77106c-73b0-4238-861a-09f2f637db71","Type":"ContainerStarted","Data":"ce0c39641a603b34ed33a24c133fa91a458e2125064547b12acdcfedeb7b051b"} Apr 16 13:59:34.058597 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.058574 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 13:59:34.058924 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.058900 2571 generic.go:358] "Generic (PLEG): container finished" podID="00f5f350-f965-4f31-9400-648a4573f987" containerID="6d57a0d3651c8b6c1e1f9ed384e839f8bb6d33e7135f184f0fd24f849990d82f" exitCode=1 Apr 16 13:59:34.059007 ip-10-0-131-99 
kubenswrapper[2571]: I0416 13:59:34.058982 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerStarted","Data":"10494fd8327e53e6163ef13a332ea051667555d5e18dc481122359113f815507"} Apr 16 13:59:34.059093 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.059012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerStarted","Data":"925cfb6a0edbae2ad6053415da2008f24e9dc40ae8d5f01580060f6cc6ebf6ad"} Apr 16 13:59:34.059093 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.059028 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerStarted","Data":"218153456f1004e5fed6447854198917a0bbbf3f4a510830fa1dbe57766c393c"} Apr 16 13:59:34.059093 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.059042 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerDied","Data":"6d57a0d3651c8b6c1e1f9ed384e839f8bb6d33e7135f184f0fd24f849990d82f"} Apr 16 13:59:34.059093 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.059058 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerStarted","Data":"766c55220a227d78f25553c3341b7080fd34b1782ca2915808bacd30e0adc641"} Apr 16 13:59:34.060451 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.060411 2571 generic.go:358] "Generic (PLEG): container finished" podID="497c5497-4184-4e77-90af-4b9edc13fa89" containerID="f183e9758f7ef40ca2f99397b1aaa994c68f0d25a33d5b2feb183d5c08f63ca8" exitCode=0 Apr 16 13:59:34.060546 ip-10-0-131-99 kubenswrapper[2571]: I0416 
13:59:34.060480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52th9" event={"ID":"497c5497-4184-4e77-90af-4b9edc13fa89","Type":"ContainerDied","Data":"f183e9758f7ef40ca2f99397b1aaa994c68f0d25a33d5b2feb183d5c08f63ca8"} Apr 16 13:59:34.061808 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.061741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" event={"ID":"77f6f93e-5bf6-4049-9cfd-a80a1c8b5a2e","Type":"ContainerStarted","Data":"1d142c019ffafa447b89f1157e95e2f71461a7beefa03b17017c3434e4bbbf12"} Apr 16 13:59:34.070312 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.070266 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hr2bh" podStartSLOduration=3.08204278 podStartE2EDuration="20.070253148s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 13:59:16.161854541 +0000 UTC m=+2.791222806" lastFinishedPulling="2026-04-16 13:59:33.150064909 +0000 UTC m=+19.779433174" observedRunningTime="2026-04-16 13:59:34.069962684 +0000 UTC m=+20.699330978" watchObservedRunningTime="2026-04-16 13:59:34.070253148 +0000 UTC m=+20.699621428" Apr 16 13:59:34.100028 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.099973 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-p4hh7" podStartSLOduration=3.176342735 podStartE2EDuration="20.099955367s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 13:59:16.201735998 +0000 UTC m=+2.831104255" lastFinishedPulling="2026-04-16 13:59:33.12534862 +0000 UTC m=+19.754716887" observedRunningTime="2026-04-16 13:59:34.099771177 +0000 UTC m=+20.729139459" watchObservedRunningTime="2026-04-16 13:59:34.099955367 +0000 UTC m=+20.729323648" Apr 16 13:59:34.100576 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.100548 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h95cv" podStartSLOduration=3.174958098 podStartE2EDuration="20.100541019s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 13:59:16.185606256 +0000 UTC m=+2.814974518" lastFinishedPulling="2026-04-16 13:59:33.11118917 +0000 UTC m=+19.740557439" observedRunningTime="2026-04-16 13:59:34.083782636 +0000 UTC m=+20.713150918" watchObservedRunningTime="2026-04-16 13:59:34.100541019 +0000 UTC m=+20.729909314" Apr 16 13:59:34.117902 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.117850 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l47f6" podStartSLOduration=3.206167234 podStartE2EDuration="20.117834908s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 13:59:16.211371043 +0000 UTC m=+2.840739300" lastFinishedPulling="2026-04-16 13:59:33.123038705 +0000 UTC m=+19.752406974" observedRunningTime="2026-04-16 13:59:34.117792312 +0000 UTC m=+20.747160590" watchObservedRunningTime="2026-04-16 13:59:34.117834908 +0000 UTC m=+20.747203187" Apr 16 13:59:34.737644 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.737618 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:34.926201 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.926100 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:34.737637925Z","UUID":"aba16c2f-29c5-47b8-8b41-0dcd18139219","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:34.929714 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.929686 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:34.929861 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.929733 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:34.950850 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:34.950826 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:34.951005 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:34.950965 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:35.067064 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:35.066983 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 13:59:35.067528 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:35.067385 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerStarted","Data":"812c75ea81d35d95becd41f07b8153fc190782b982c67ea755e7021baaaa91d9"} Apr 16 13:59:35.068889 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:35.068862 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m57gb" event={"ID":"b9041b28-b0ce-4c85-ab1a-80e6b2609764","Type":"ContainerStarted","Data":"a3f7c5cc1b04b8bca018739b4f18856cc4b473a5c4df7410b83e3cc0bdd53535"} Apr 16 13:59:35.070516 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:35.070468 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" event={"ID":"bf77106c-73b0-4238-861a-09f2f637db71","Type":"ContainerStarted","Data":"db795da9cdcac29c6d45dcf78ce6c39572adb2f0327b098e2c4c871cff7af69d"} Apr 16 13:59:35.085182 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:35.085133 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m57gb" podStartSLOduration=4.165677919 podStartE2EDuration="21.085120686s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 13:59:16.191751906 +0000 UTC m=+2.821120177" lastFinishedPulling="2026-04-16 13:59:33.111194682 +0000 UTC m=+19.740562944" observedRunningTime="2026-04-16 13:59:35.084642927 +0000 UTC m=+21.714011207" watchObservedRunningTime="2026-04-16 13:59:35.085120686 +0000 UTC m=+21.714488965" Apr 16 13:59:35.950746 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:35.950703 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:35.950960 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:35.950753 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:35.950960 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:35.950833 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:35.950960 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:35.950949 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0" Apr 16 13:59:36.951432 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:36.951392 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:36.951915 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:36.951520 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:37.076106 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:37.076042 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" event={"ID":"bf77106c-73b0-4238-861a-09f2f637db71","Type":"ContainerStarted","Data":"2c9666a902071b687587917d35987e2250cdcfc922845dd4c857c10516e34255"} Apr 16 13:59:37.079082 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:37.079052 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 13:59:37.079417 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:37.079393 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerStarted","Data":"189eb9668a19b462b256df5b8a58c45f86d75a4705e0f2a7d07a496d15c76e1e"} Apr 16 13:59:37.951318 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:37.951280 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:37.951498 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:37.951330 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:37.951498 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:37.951408 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c" Apr 16 13:59:37.951901 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:37.951550 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0" Apr 16 13:59:38.950654 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:38.950474 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:38.950795 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:38.950733 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b" Apr 16 13:59:39.085269 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.085195 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 13:59:39.086029 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.085550 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerStarted","Data":"38412379afae302c3a55f4055b340d31bb8863a7c8be918c22e2b9ee8ab30f31"} Apr 16 13:59:39.086029 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.085818 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:39.086029 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.086020 2571 scope.go:117] "RemoveContainer" containerID="6d57a0d3651c8b6c1e1f9ed384e839f8bb6d33e7135f184f0fd24f849990d82f" Apr 16 13:59:39.087290 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.087268 2571 generic.go:358] "Generic (PLEG): container finished" podID="497c5497-4184-4e77-90af-4b9edc13fa89" containerID="803fd957bc4f1db6e757f6e8f0d62668489c84da04c93797f940fc050b8028ff" exitCode=0 Apr 16 13:59:39.087413 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.087305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52th9" event={"ID":"497c5497-4184-4e77-90af-4b9edc13fa89","Type":"ContainerDied","Data":"803fd957bc4f1db6e757f6e8f0d62668489c84da04c93797f940fc050b8028ff"} Apr 16 13:59:39.101655 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.101632 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 13:59:39.135396 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.135358 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6dlmv" podStartSLOduration=5.213751398 podStartE2EDuration="25.135343833s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 13:59:16.173280667 +0000 UTC m=+2.802648927" lastFinishedPulling="2026-04-16 13:59:36.094873103 +0000 UTC m=+22.724241362" observedRunningTime="2026-04-16 13:59:37.093429121 +0000 UTC m=+23.722797400" watchObservedRunningTime="2026-04-16 13:59:39.135343833 +0000 UTC m=+25.764712148" Apr 16 13:59:39.844021 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.843837 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:39.844201 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:39.843987 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:39.844201 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:39.844125 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret podName:83635cb4-c000-4ff0-8ff5-171c0a1c00d0 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:55.844109139 +0000 UTC m=+42.473477401 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret") pod "global-pull-secret-syncer-bcp9m" (UID: "83635cb4-c000-4ff0-8ff5-171c0a1c00d0") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:39.950908 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.950871 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m"
Apr 16 13:59:39.951087 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:39.950921 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 13:59:39.951087 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:39.951010 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0"
Apr 16 13:59:39.951202 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:39.951159 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c"
Apr 16 13:59:40.093132 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.093107 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log"
Apr 16 13:59:40.093606 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.093484 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" event={"ID":"00f5f350-f965-4f31-9400-648a4573f987","Type":"ContainerStarted","Data":"23d125d5186cb57931f536f7a66d4d25c349344209488e7a6eca3596036767f3"}
Apr 16 13:59:40.093782 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.093753 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5"
Apr 16 13:59:40.093854 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.093795 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5"
Apr 16 13:59:40.109980 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.109886 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5"
Apr 16 13:59:40.128857 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.128809 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" podStartSLOduration=9.169327999 podStartE2EDuration="26.128792814s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 13:59:16.22041214 +0000 UTC m=+2.849780400" lastFinishedPulling="2026-04-16 13:59:33.179876959 +0000 UTC m=+19.809245215" observedRunningTime="2026-04-16 13:59:40.128571587 +0000 UTC m=+26.757939865" watchObservedRunningTime="2026-04-16 13:59:40.128792814 +0000 UTC m=+26.758161097"
Apr 16 13:59:40.337240 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.337214 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ptrgm"]
Apr 16 13:59:40.337377 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.337326 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 13:59:40.337456 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:40.337434 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c"
Apr 16 13:59:40.341399 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.341350 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bcp9m"]
Apr 16 13:59:40.341514 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.341490 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m"
Apr 16 13:59:40.341620 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:40.341593 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0"
Apr 16 13:59:40.342055 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.342022 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gg599"]
Apr 16 13:59:40.342176 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:40.342161 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599"
Apr 16 13:59:40.342300 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:40.342275 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b"
Apr 16 13:59:41.097393 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:41.097362 2571 generic.go:358] "Generic (PLEG): container finished" podID="497c5497-4184-4e77-90af-4b9edc13fa89" containerID="19cb5030b3455d54c9bc60da5cf702951ba92e2560ed4a175d5af45049c5aaad" exitCode=0
Apr 16 13:59:41.097768 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:41.097455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52th9" event={"ID":"497c5497-4184-4e77-90af-4b9edc13fa89","Type":"ContainerDied","Data":"19cb5030b3455d54c9bc60da5cf702951ba92e2560ed4a175d5af45049c5aaad"}
Apr 16 13:59:41.950660 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:41.950629 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599"
Apr 16 13:59:41.950806 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:41.950628 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 13:59:41.950806 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:41.950732 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b"
Apr 16 13:59:41.950806 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:41.950638 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m"
Apr 16 13:59:41.950957 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:41.950808 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c"
Apr 16 13:59:41.950957 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:41.950886 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0"
Apr 16 13:59:43.102858 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:43.102826 2571 generic.go:358] "Generic (PLEG): container finished" podID="497c5497-4184-4e77-90af-4b9edc13fa89" containerID="2f3f516d960f6dc61c9f079f48a8c78c4ee041ed95a560ac9b63d7aaff228d7d" exitCode=0
Apr 16 13:59:43.103271 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:43.102877 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52th9" event={"ID":"497c5497-4184-4e77-90af-4b9edc13fa89","Type":"ContainerDied","Data":"2f3f516d960f6dc61c9f079f48a8c78c4ee041ed95a560ac9b63d7aaff228d7d"}
Apr 16 13:59:43.951225 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:43.951198 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599"
Apr 16 13:59:43.951385 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:43.951305 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b"
Apr 16 13:59:43.951385 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:43.951349 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 13:59:43.951472 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:43.951392 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c"
Apr 16 13:59:43.951472 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:43.951422 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m"
Apr 16 13:59:43.951541 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:43.951470 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0"
Apr 16 13:59:45.951030 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:45.950993 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599"
Apr 16 13:59:45.951030 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:45.951009 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m"
Apr 16 13:59:45.951516 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:45.951160 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b"
Apr 16 13:59:45.951516 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:45.951215 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bcp9m" podUID="83635cb4-c000-4ff0-8ff5-171c0a1c00d0"
Apr 16 13:59:45.951516 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:45.951246 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 13:59:45.951516 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:45.951311 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ptrgm" podUID="a45bf770-bb2a-4a8f-8fa8-60cb36789e8c"
Apr 16 13:59:47.218261 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.217944 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-99.ec2.internal" event="NodeReady"
Apr 16 13:59:47.218722 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.218351 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 13:59:47.254381 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.254352 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn"]
Apr 16 13:59:47.277496 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.277442 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54c49c4f59-mqzlz"]
Apr 16 13:59:47.277659 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.277626 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn"
Apr 16 13:59:47.283603 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.280788 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-zlrcf\""
Apr 16 13:59:47.283603 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.280953 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 13:59:47.283603 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.281352 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 13:59:47.283603 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.281597 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 13:59:47.283603 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.283463 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 13:59:47.295599 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.295549 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"]
Apr 16 13:59:47.295739 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.295713 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.298540 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.298509 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 13:59:47.298650 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.298596 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jfbd2\""
Apr 16 13:59:47.298650 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.298599 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 13:59:47.298650 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.298622 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 13:59:47.306059 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.306037 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 13:59:47.313062 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.313038 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm"]
Apr 16 13:59:47.313241 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.313227 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.316404 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.316383 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 13:59:47.316517 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.316486 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 13:59:47.316581 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.316532 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 13:59:47.316652 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.316634 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 13:59:47.331305 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.331279 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn"]
Apr 16 13:59:47.331305 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.331311 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm"]
Apr 16 13:59:47.331484 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.331431 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm"
Apr 16 13:59:47.331484 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.331439 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"]
Apr 16 13:59:47.331484 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.331464 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54c49c4f59-mqzlz"]
Apr 16 13:59:47.331484 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.331481 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wjbcm"]
Apr 16 13:59:47.334435 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.334396 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 13:59:47.343175 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.343139 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wjbcm"]
Apr 16 13:59:47.343301 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.343269 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wjbcm"
Apr 16 13:59:47.345910 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.345886 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 13:59:47.346045 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.345914 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 13:59:47.346187 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.346168 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-r2bfh\""
Apr 16 13:59:47.346250 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.346202 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 13:59:47.373369 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.373347 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v72fr"]
Apr 16 13:59:47.386835 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.386810 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v72fr"]
Apr 16 13:59:47.386965 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.386948 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v72fr"
Apr 16 13:59:47.390010 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.389989 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5ngv\""
Apr 16 13:59:47.390404 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.390377 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 13:59:47.390502 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.390460 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 13:59:47.397302 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397276 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.397407 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397317 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcc7w\" (UniqueName: \"kubernetes.io/projected/a79fd531-9d97-4643-9f14-87092aca16e5-kube-api-access-bcc7w\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.397407 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.397517 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-ca\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.397517 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397457 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-installation-pull-secrets\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.397517 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-trusted-ca\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.397517 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397511 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-bound-sa-token\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.397699 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397594 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/490046cd-2f06-4014-b0d3-7662ed1b4f8e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68f559f9c9-d5mjn\" (UID: \"490046cd-2f06-4014-b0d3-7662ed1b4f8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn"
Apr 16 13:59:47.397699 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397665 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w86pw\" (UniqueName: \"kubernetes.io/projected/490046cd-2f06-4014-b0d3-7662ed1b4f8e-kube-api-access-w86pw\") pod \"managed-serviceaccount-addon-agent-68f559f9c9-d5mjn\" (UID: \"490046cd-2f06-4014-b0d3-7662ed1b4f8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn"
Apr 16 13:59:47.397699 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397691 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-image-registry-private-configuration\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.397802 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqh2c\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-kube-api-access-mqh2c\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.397802 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397745 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77410159-0fa9-45da-acb5-37356380ab25-tmp\") pod \"klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm\" (UID: \"77410159-0fa9-45da-acb5-37356380ab25\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm"
Apr 16 13:59:47.397802 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397772 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thgvr\" (UniqueName: \"kubernetes.io/projected/77410159-0fa9-45da-acb5-37356380ab25-kube-api-access-thgvr\") pod \"klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm\" (UID: \"77410159-0fa9-45da-acb5-37356380ab25\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm"
Apr 16 13:59:47.397944 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.397944 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-certificates\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.397944 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ec9d172-ec00-458f-b2d6-85404e6b97bf-ca-trust-extracted\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.398107 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397967 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-hub\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.398107 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.397999 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/77410159-0fa9-45da-acb5-37356380ab25-klusterlet-config\") pod \"klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm\" (UID: \"77410159-0fa9-45da-acb5-37356380ab25\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm"
Apr 16 13:59:47.398107 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.398030 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a79fd531-9d97-4643-9f14-87092aca16e5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.498746 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/77410159-0fa9-45da-acb5-37356380ab25-klusterlet-config\") pod \"klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm\" (UID: \"77410159-0fa9-45da-acb5-37356380ab25\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm"
Apr 16 13:59:47.498746 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a79fd531-9d97-4643-9f14-87092aca16e5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.498746 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcc7w\" (UniqueName: \"kubernetes.io/projected/a79fd531-9d97-4643-9f14-87092aca16e5-kube-api-access-bcc7w\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498800 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xtwq\" (UniqueName: \"kubernetes.io/projected/eaab260d-b8fe-47b6-8446-b16d19857d43-kube-api-access-7xtwq\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498827 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-ca\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498859 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4xkx\" (UniqueName: \"kubernetes.io/projected/aa74ab6f-55fb-4757-9677-130c7dc8c62c-kube-api-access-q4xkx\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.498905 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-installation-pull-secrets\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.498923 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c49c4f59-mqzlz: secret "image-registry-tls" not found
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498942 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-trusted-ca\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.498963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-bound-sa-token\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.499016 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.498986 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls podName:8ec9d172-ec00-458f-b2d6-85404e6b97bf nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.99896482 +0000 UTC m=+34.628333097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls") pod "image-registry-54c49c4f59-mqzlz" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf") : secret "image-registry-tls" not found
Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499052 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/490046cd-2f06-4014-b0d3-7662ed1b4f8e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68f559f9c9-d5mjn\" (UID: \"490046cd-2f06-4014-b0d3-7662ed1b4f8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn"
Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499099 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w86pw\" (UniqueName: \"kubernetes.io/projected/490046cd-2f06-4014-b0d3-7662ed1b4f8e-kube-api-access-w86pw\") pod \"managed-serviceaccount-addon-agent-68f559f9c9-d5mjn\" (UID: \"490046cd-2f06-4014-b0d3-7662ed1b4f8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn"
Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-image-registry-private-configuration\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqh2c\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-kube-api-access-mqh2c\")
pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77410159-0fa9-45da-acb5-37356380ab25-tmp\") pod \"klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm\" (UID: \"77410159-0fa9-45da-acb5-37356380ab25\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thgvr\" (UniqueName: \"kubernetes.io/projected/77410159-0fa9-45da-acb5-37356380ab25-kube-api-access-thgvr\") pod \"klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm\" (UID: \"77410159-0fa9-45da-acb5-37356380ab25\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499240 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-certificates\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 
13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499308 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eaab260d-b8fe-47b6-8446-b16d19857d43-tmp-dir\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499332 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ec9d172-ec00-458f-b2d6-85404e6b97bf-ca-trust-extracted\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499390 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaab260d-b8fe-47b6-8446-b16d19857d43-config-volume\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:47.499553 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499426 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-hub\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" Apr 16 13:59:47.500206 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499852 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a79fd531-9d97-4643-9f14-87092aca16e5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" Apr 16 13:59:47.500206 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.499954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77410159-0fa9-45da-acb5-37356380ab25-tmp\") pod \"klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm\" (UID: \"77410159-0fa9-45da-acb5-37356380ab25\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 13:59:47.500544 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.500514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-trusted-ca\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:47.500974 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.500948 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-certificates\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:47.501452 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.501261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/8ec9d172-ec00-458f-b2d6-85404e6b97bf-ca-trust-extracted\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:47.504438 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.504393 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-image-registry-private-configuration\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:47.504438 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.504428 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" Apr 16 13:59:47.504611 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.504455 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/490046cd-2f06-4014-b0d3-7662ed1b4f8e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68f559f9c9-d5mjn\" (UID: \"490046cd-2f06-4014-b0d3-7662ed1b4f8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn" Apr 16 13:59:47.504611 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.504529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-hub\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" Apr 16 13:59:47.504611 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.504555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/77410159-0fa9-45da-acb5-37356380ab25-klusterlet-config\") pod \"klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm\" (UID: \"77410159-0fa9-45da-acb5-37356380ab25\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 13:59:47.505136 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.505097 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-installation-pull-secrets\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:47.505428 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.505392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" Apr 16 13:59:47.505750 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.505728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a79fd531-9d97-4643-9f14-87092aca16e5-ca\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" Apr 16 13:59:47.511953 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.511924 2571 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-bcc7w\" (UniqueName: \"kubernetes.io/projected/a79fd531-9d97-4643-9f14-87092aca16e5-kube-api-access-bcc7w\") pod \"cluster-proxy-proxy-agent-5847b58c87-sgngd\" (UID: \"a79fd531-9d97-4643-9f14-87092aca16e5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" Apr 16 13:59:47.514574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.514451 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thgvr\" (UniqueName: \"kubernetes.io/projected/77410159-0fa9-45da-acb5-37356380ab25-kube-api-access-thgvr\") pod \"klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm\" (UID: \"77410159-0fa9-45da-acb5-37356380ab25\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 13:59:47.514574 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.514452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w86pw\" (UniqueName: \"kubernetes.io/projected/490046cd-2f06-4014-b0d3-7662ed1b4f8e-kube-api-access-w86pw\") pod \"managed-serviceaccount-addon-agent-68f559f9c9-d5mjn\" (UID: \"490046cd-2f06-4014-b0d3-7662ed1b4f8e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn" Apr 16 13:59:47.525626 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.525596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-bound-sa-token\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:47.525792 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.525766 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqh2c\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-kube-api-access-mqh2c\") pod 
\"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:47.600797 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.600755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:47.600982 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.600821 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eaab260d-b8fe-47b6-8446-b16d19857d43-tmp-dir\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:47.600982 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.600867 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 13:59:47.600982 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.600897 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaab260d-b8fe-47b6-8446-b16d19857d43-config-volume\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:47.600982 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.600921 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:47.600982 ip-10-0-131-99 kubenswrapper[2571]: I0416 
13:59:47.600948 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xtwq\" (UniqueName: \"kubernetes.io/projected/eaab260d-b8fe-47b6-8446-b16d19857d43-kube-api-access-7xtwq\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:47.600982 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.600985 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs podName:97f73dc3-4dcf-4643-8dc6-cd6e6418679b nodeName:}" failed. No retries permitted until 2026-04-16 14:00:19.600967491 +0000 UTC m=+66.230335761 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs") pod "network-metrics-daemon-gg599" (UID: "97f73dc3-4dcf-4643-8dc6-cd6e6418679b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:47.601337 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.601027 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:47.601337 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.601117 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert podName:aa74ab6f-55fb-4757-9677-130c7dc8c62c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:48.101098871 +0000 UTC m=+34.730467134 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert") pod "ingress-canary-wjbcm" (UID: "aa74ab6f-55fb-4757-9677-130c7dc8c62c") : secret "canary-serving-cert" not found Apr 16 13:59:47.601337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.601201 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:47.601337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.601239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4xkx\" (UniqueName: \"kubernetes.io/projected/aa74ab6f-55fb-4757-9677-130c7dc8c62c-kube-api-access-q4xkx\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 13:59:47.601337 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.601263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eaab260d-b8fe-47b6-8446-b16d19857d43-tmp-dir\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:47.601337 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.601282 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:47.601587 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.601348 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls podName:eaab260d-b8fe-47b6-8446-b16d19857d43 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:48.101332495 +0000 UTC m=+34.730700762 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls") pod "dns-default-v72fr" (UID: "eaab260d-b8fe-47b6-8446-b16d19857d43") : secret "dns-default-metrics-tls" not found Apr 16 13:59:47.601587 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.601522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaab260d-b8fe-47b6-8446-b16d19857d43-config-volume\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:47.603728 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.603706 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn" Apr 16 13:59:47.613968 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.613936 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4xkx\" (UniqueName: \"kubernetes.io/projected/aa74ab6f-55fb-4757-9677-130c7dc8c62c-kube-api-access-q4xkx\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 13:59:47.618884 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.618856 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xtwq\" (UniqueName: \"kubernetes.io/projected/eaab260d-b8fe-47b6-8446-b16d19857d43-kube-api-access-7xtwq\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:47.623040 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.622975 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" Apr 16 13:59:47.656605 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.656573 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 13:59:47.701881 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.701847 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:47.702062 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.702017 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:47.702062 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.702034 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:47.702062 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.702043 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vsdpt for pod openshift-network-diagnostics/network-check-target-ptrgm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:47.702246 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:47.702115 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt podName:a45bf770-bb2a-4a8f-8fa8-60cb36789e8c nodeName:}" failed. 
No retries permitted until 2026-04-16 14:00:19.7020956 +0000 UTC m=+66.331463860 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-vsdpt" (UniqueName: "kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt") pod "network-check-target-ptrgm" (UID: "a45bf770-bb2a-4a8f-8fa8-60cb36789e8c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:47.950854 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.950765 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:47.950854 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.950831 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 13:59:47.951098 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.950941 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm" Apr 16 13:59:47.953743 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.953721 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:59:47.953952 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.953916 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:59:47.954039 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.953974 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:59:47.954039 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.953987 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 13:59:47.954190 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.954084 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2qtwk\"" Apr 16 13:59:47.954293 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:47.954273 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m8ds6\"" Apr 16 13:59:48.004479 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:48.004439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:48.004671 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:48.004632 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 16 13:59:48.004671 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:48.004651 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c49c4f59-mqzlz: secret "image-registry-tls" not found Apr 16 13:59:48.004749 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:48.004714 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls podName:8ec9d172-ec00-458f-b2d6-85404e6b97bf nodeName:}" failed. No retries permitted until 2026-04-16 13:59:49.004700141 +0000 UTC m=+35.634068399 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls") pod "image-registry-54c49c4f59-mqzlz" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf") : secret "image-registry-tls" not found Apr 16 13:59:48.105618 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:48.105585 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 13:59:48.105812 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:48.105669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:48.105812 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:48.105803 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:48.105927 ip-10-0-131-99 kubenswrapper[2571]: E0416 
13:59:48.105830 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:48.105927 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:48.105868 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert podName:aa74ab6f-55fb-4757-9677-130c7dc8c62c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:49.105851771 +0000 UTC m=+35.735220031 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert") pod "ingress-canary-wjbcm" (UID: "aa74ab6f-55fb-4757-9677-130c7dc8c62c") : secret "canary-serving-cert" not found Apr 16 13:59:48.105927 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:48.105917 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls podName:eaab260d-b8fe-47b6-8446-b16d19857d43 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:49.105905266 +0000 UTC m=+35.735273523 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls") pod "dns-default-v72fr" (UID: "eaab260d-b8fe-47b6-8446-b16d19857d43") : secret "dns-default-metrics-tls" not found Apr 16 13:59:49.015185 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:49.015150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:49.015771 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:49.015301 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:49.015771 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:49.015324 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c49c4f59-mqzlz: secret "image-registry-tls" not found Apr 16 13:59:49.015771 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:49.015396 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls podName:8ec9d172-ec00-458f-b2d6-85404e6b97bf nodeName:}" failed. No retries permitted until 2026-04-16 13:59:51.01537615 +0000 UTC m=+37.644744409 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls") pod "image-registry-54c49c4f59-mqzlz" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf") : secret "image-registry-tls" not found Apr 16 13:59:49.115689 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:49.115661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:49.115834 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:49.115745 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 13:59:49.115834 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:49.115810 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:49.115932 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:49.115836 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:49.115932 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:49.115885 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert podName:aa74ab6f-55fb-4757-9677-130c7dc8c62c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:51.115869649 +0000 UTC m=+37.745237906 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert") pod "ingress-canary-wjbcm" (UID: "aa74ab6f-55fb-4757-9677-130c7dc8c62c") : secret "canary-serving-cert" not found Apr 16 13:59:49.115932 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:49.115899 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls podName:eaab260d-b8fe-47b6-8446-b16d19857d43 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:51.11589317 +0000 UTC m=+37.745261428 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls") pod "dns-default-v72fr" (UID: "eaab260d-b8fe-47b6-8446-b16d19857d43") : secret "dns-default-metrics-tls" not found Apr 16 13:59:49.226402 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:49.226370 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn"] Apr 16 13:59:49.230012 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:49.229988 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm"] Apr 16 13:59:49.244094 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:49.244050 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"] Apr 16 13:59:49.328949 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:49.328857 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod490046cd_2f06_4014_b0d3_7662ed1b4f8e.slice/crio-a4cb4dcc0a81790ee99a76ca8465752b9beabb75309c2236ea6c1ec275c62762 WatchSource:0}: Error finding container a4cb4dcc0a81790ee99a76ca8465752b9beabb75309c2236ea6c1ec275c62762: 
Status 404 returned error can't find the container with id a4cb4dcc0a81790ee99a76ca8465752b9beabb75309c2236ea6c1ec275c62762 Apr 16 13:59:49.329117 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:49.329064 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77410159_0fa9_45da_acb5_37356380ab25.slice/crio-9bde61cdc74b4406eba69edf543b6c3a15a29f805a1ec8ca4e5e765e22d18b58 WatchSource:0}: Error finding container 9bde61cdc74b4406eba69edf543b6c3a15a29f805a1ec8ca4e5e765e22d18b58: Status 404 returned error can't find the container with id 9bde61cdc74b4406eba69edf543b6c3a15a29f805a1ec8ca4e5e765e22d18b58 Apr 16 13:59:49.329509 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:49.329491 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda79fd531_9d97_4643_9f14_87092aca16e5.slice/crio-adac7db37d881c28161eaa951b83c9a7b271b3654b45786cd6dfca585df522a5 WatchSource:0}: Error finding container adac7db37d881c28161eaa951b83c9a7b271b3654b45786cd6dfca585df522a5: Status 404 returned error can't find the container with id adac7db37d881c28161eaa951b83c9a7b271b3654b45786cd6dfca585df522a5 Apr 16 13:59:50.117913 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:50.117693 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" event={"ID":"77410159-0fa9-45da-acb5-37356380ab25","Type":"ContainerStarted","Data":"9bde61cdc74b4406eba69edf543b6c3a15a29f805a1ec8ca4e5e765e22d18b58"} Apr 16 13:59:50.119327 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:50.119296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn" event={"ID":"490046cd-2f06-4014-b0d3-7662ed1b4f8e","Type":"ContainerStarted","Data":"a4cb4dcc0a81790ee99a76ca8465752b9beabb75309c2236ea6c1ec275c62762"} Apr 
16 13:59:50.120407 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:50.120381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" event={"ID":"a79fd531-9d97-4643-9f14-87092aca16e5","Type":"ContainerStarted","Data":"adac7db37d881c28161eaa951b83c9a7b271b3654b45786cd6dfca585df522a5"} Apr 16 13:59:50.123273 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:50.123196 2571 generic.go:358] "Generic (PLEG): container finished" podID="497c5497-4184-4e77-90af-4b9edc13fa89" containerID="fa1752573659e629c8ef9af8d5a71b9b927e14258fb3023713d1eb09c1f470ae" exitCode=0 Apr 16 13:59:50.123273 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:50.123246 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52th9" event={"ID":"497c5497-4184-4e77-90af-4b9edc13fa89","Type":"ContainerDied","Data":"fa1752573659e629c8ef9af8d5a71b9b927e14258fb3023713d1eb09c1f470ae"} Apr 16 13:59:51.036539 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:51.035912 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:51.036539 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:51.036094 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:51.036539 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:51.036113 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c49c4f59-mqzlz: secret "image-registry-tls" not found Apr 16 13:59:51.036539 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:51.036175 2571 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls podName:8ec9d172-ec00-458f-b2d6-85404e6b97bf nodeName:}" failed. No retries permitted until 2026-04-16 13:59:55.036155559 +0000 UTC m=+41.665523817 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls") pod "image-registry-54c49c4f59-mqzlz" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf") : secret "image-registry-tls" not found Apr 16 13:59:51.133776 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:51.133669 2571 generic.go:358] "Generic (PLEG): container finished" podID="497c5497-4184-4e77-90af-4b9edc13fa89" containerID="6a70226559c7cb7c98577149a5d4100da17c78d647ef333e38b15341bab0d39c" exitCode=0 Apr 16 13:59:51.133776 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:51.133734 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52th9" event={"ID":"497c5497-4184-4e77-90af-4b9edc13fa89","Type":"ContainerDied","Data":"6a70226559c7cb7c98577149a5d4100da17c78d647ef333e38b15341bab0d39c"} Apr 16 13:59:51.137190 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:51.137161 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 13:59:51.137320 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:51.137251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:51.137461 ip-10-0-131-99 kubenswrapper[2571]: E0416 
13:59:51.137439 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:51.137525 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:51.137499 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls podName:eaab260d-b8fe-47b6-8446-b16d19857d43 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:55.137480516 +0000 UTC m=+41.766848776 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls") pod "dns-default-v72fr" (UID: "eaab260d-b8fe-47b6-8446-b16d19857d43") : secret "dns-default-metrics-tls" not found Apr 16 13:59:51.137875 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:51.137858 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:51.137955 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:51.137906 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert podName:aa74ab6f-55fb-4757-9677-130c7dc8c62c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:55.137892854 +0000 UTC m=+41.767261115 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert") pod "ingress-canary-wjbcm" (UID: "aa74ab6f-55fb-4757-9677-130c7dc8c62c") : secret "canary-serving-cert" not found Apr 16 13:59:55.076524 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.076480 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 13:59:55.076924 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:55.076650 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:55.076924 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:55.076673 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c49c4f59-mqzlz: secret "image-registry-tls" not found Apr 16 13:59:55.076924 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:55.076737 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls podName:8ec9d172-ec00-458f-b2d6-85404e6b97bf nodeName:}" failed. No retries permitted until 2026-04-16 14:00:03.076720648 +0000 UTC m=+49.706088910 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls") pod "image-registry-54c49c4f59-mqzlz" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf") : secret "image-registry-tls" not found Apr 16 13:59:55.143469 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.143433 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" event={"ID":"a79fd531-9d97-4643-9f14-87092aca16e5","Type":"ContainerStarted","Data":"c4482838e06fcf74b0794295c30cf6f294b132629962f352486957d4ca495814"} Apr 16 13:59:55.146858 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.146828 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-52th9" event={"ID":"497c5497-4184-4e77-90af-4b9edc13fa89","Type":"ContainerStarted","Data":"3e3d8e9e366cf3d0a60c3edcd92a42e43abf5144ae8a1882549cb6418334b214"} Apr 16 13:59:55.148178 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.148155 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" event={"ID":"77410159-0fa9-45da-acb5-37356380ab25","Type":"ContainerStarted","Data":"bfaf7aa7ae8a754d5dafc0b93fa0fe2088a7d5d2d61c97bf3f95f9ced15da42e"} Apr 16 13:59:55.148389 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.148369 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 13:59:55.149536 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.149499 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn" event={"ID":"490046cd-2f06-4014-b0d3-7662ed1b4f8e","Type":"ContainerStarted","Data":"4a8eda1b5abb7e378b4331fa741277ab689375b82a7ca76137f706eed31accde"} Apr 16 
13:59:55.150298 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.150275 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 13:59:55.173274 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.173222 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-52th9" podStartSLOduration=8.021018627 podStartE2EDuration="41.173203461s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 13:59:16.219031101 +0000 UTC m=+2.848399358" lastFinishedPulling="2026-04-16 13:59:49.371215931 +0000 UTC m=+36.000584192" observedRunningTime="2026-04-16 13:59:55.170780085 +0000 UTC m=+41.800148364" watchObservedRunningTime="2026-04-16 13:59:55.173203461 +0000 UTC m=+41.802571738" Apr 16 13:59:55.177359 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.177338 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 13:59:55.177452 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.177417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 13:59:55.177523 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:55.177500 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:55.177580 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:55.177560 2571 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:55.177636 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:55.177569 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert podName:aa74ab6f-55fb-4757-9677-130c7dc8c62c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:03.177551303 +0000 UTC m=+49.806919596 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert") pod "ingress-canary-wjbcm" (UID: "aa74ab6f-55fb-4757-9677-130c7dc8c62c") : secret "canary-serving-cert" not found Apr 16 13:59:55.177636 ip-10-0-131-99 kubenswrapper[2571]: E0416 13:59:55.177616 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls podName:eaab260d-b8fe-47b6-8446-b16d19857d43 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:03.177603924 +0000 UTC m=+49.806972181 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls") pod "dns-default-v72fr" (UID: "eaab260d-b8fe-47b6-8446-b16d19857d43") : secret "dns-default-metrics-tls" not found Apr 16 13:59:55.185785 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.185749 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn" podStartSLOduration=17.366299071 podStartE2EDuration="22.185736673s" podCreationTimestamp="2026-04-16 13:59:33 +0000 UTC" firstStartedPulling="2026-04-16 13:59:49.348507156 +0000 UTC m=+35.977875420" lastFinishedPulling="2026-04-16 13:59:54.167944759 +0000 UTC m=+40.797313022" observedRunningTime="2026-04-16 13:59:55.184972339 +0000 UTC m=+41.814340619" watchObservedRunningTime="2026-04-16 13:59:55.185736673 +0000 UTC m=+41.815104953" Apr 16 13:59:55.200200 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.200158 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" podStartSLOduration=17.363053493 podStartE2EDuration="22.200144465s" podCreationTimestamp="2026-04-16 13:59:33 +0000 UTC" firstStartedPulling="2026-04-16 13:59:49.348269965 +0000 UTC m=+35.977638222" lastFinishedPulling="2026-04-16 13:59:54.185360925 +0000 UTC m=+40.814729194" observedRunningTime="2026-04-16 13:59:55.199847539 +0000 UTC m=+41.829215819" watchObservedRunningTime="2026-04-16 13:59:55.200144465 +0000 UTC m=+41.829512745" Apr 16 13:59:55.883591 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.883553 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " 
pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:55.887396 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:55.887368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83635cb4-c000-4ff0-8ff5-171c0a1c00d0-original-pull-secret\") pod \"global-pull-secret-syncer-bcp9m\" (UID: \"83635cb4-c000-4ff0-8ff5-171c0a1c00d0\") " pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:56.075019 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:56.074984 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bcp9m" Apr 16 13:59:56.365942 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:56.365906 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bcp9m"] Apr 16 13:59:56.369684 ip-10-0-131-99 kubenswrapper[2571]: W0416 13:59:56.369654 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83635cb4_c000_4ff0_8ff5_171c0a1c00d0.slice/crio-0ace1710c969e5bb98230fb33f5637ee01a60fa993de064baf7a694c42c2e083 WatchSource:0}: Error finding container 0ace1710c969e5bb98230fb33f5637ee01a60fa993de064baf7a694c42c2e083: Status 404 returned error can't find the container with id 0ace1710c969e5bb98230fb33f5637ee01a60fa993de064baf7a694c42c2e083 Apr 16 13:59:57.155214 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:57.155181 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" event={"ID":"a79fd531-9d97-4643-9f14-87092aca16e5","Type":"ContainerStarted","Data":"fd3f9f8f7b4cd9c5b9505108f2fdc4eab8840a947dcfdf1cb39f83fbcf98cd5c"} Apr 16 13:59:57.155381 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:57.155221 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" event={"ID":"a79fd531-9d97-4643-9f14-87092aca16e5","Type":"ContainerStarted","Data":"b512419993fbdc77c089f06ca199bdd7561051968cb1f415361e443798899318"} Apr 16 13:59:57.156250 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:57.156230 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bcp9m" event={"ID":"83635cb4-c000-4ff0-8ff5-171c0a1c00d0","Type":"ContainerStarted","Data":"0ace1710c969e5bb98230fb33f5637ee01a60fa993de064baf7a694c42c2e083"} Apr 16 13:59:57.174718 ip-10-0-131-99 kubenswrapper[2571]: I0416 13:59:57.174673 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" podStartSLOduration=16.80208542 podStartE2EDuration="24.174661782s" podCreationTimestamp="2026-04-16 13:59:33 +0000 UTC" firstStartedPulling="2026-04-16 13:59:49.348271047 +0000 UTC m=+35.977639304" lastFinishedPulling="2026-04-16 13:59:56.720847398 +0000 UTC m=+43.350215666" observedRunningTime="2026-04-16 13:59:57.17428387 +0000 UTC m=+43.803652230" watchObservedRunningTime="2026-04-16 13:59:57.174661782 +0000 UTC m=+43.804030061" Apr 16 14:00:01.167267 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:01.167233 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bcp9m" event={"ID":"83635cb4-c000-4ff0-8ff5-171c0a1c00d0","Type":"ContainerStarted","Data":"9187886314a5f2d6cedf67f6c41958741a793837b0bdda86227310fd75bc9f8a"} Apr 16 14:00:01.184369 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:01.184323 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bcp9m" podStartSLOduration=34.076739423 podStartE2EDuration="38.18431074s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:56.371381744 +0000 UTC m=+43.000750001" 
lastFinishedPulling="2026-04-16 14:00:00.478953056 +0000 UTC m=+47.108321318" observedRunningTime="2026-04-16 14:00:01.182719088 +0000 UTC m=+47.812087366" watchObservedRunningTime="2026-04-16 14:00:01.18431074 +0000 UTC m=+47.813679019" Apr 16 14:00:03.141357 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:03.141322 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 14:00:03.141802 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:03.141481 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:00:03.141802 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:03.141505 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c49c4f59-mqzlz: secret "image-registry-tls" not found Apr 16 14:00:03.141802 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:03.141587 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls podName:8ec9d172-ec00-458f-b2d6-85404e6b97bf nodeName:}" failed. No retries permitted until 2026-04-16 14:00:19.141566391 +0000 UTC m=+65.770934649 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls") pod "image-registry-54c49c4f59-mqzlz" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf") : secret "image-registry-tls" not found Apr 16 14:00:03.241687 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:03.241651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 14:00:03.241839 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:03.241745 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 14:00:03.241839 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:03.241801 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:03.241839 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:03.241819 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:03.241927 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:03.241868 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls podName:eaab260d-b8fe-47b6-8446-b16d19857d43 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:19.241852325 +0000 UTC m=+65.871220594 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls") pod "dns-default-v72fr" (UID: "eaab260d-b8fe-47b6-8446-b16d19857d43") : secret "dns-default-metrics-tls" not found Apr 16 14:00:03.241927 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:03.241883 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert podName:aa74ab6f-55fb-4757-9677-130c7dc8c62c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:19.24187749 +0000 UTC m=+65.871245747 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert") pod "ingress-canary-wjbcm" (UID: "aa74ab6f-55fb-4757-9677-130c7dc8c62c") : secret "canary-serving-cert" not found Apr 16 14:00:12.117419 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:12.117389 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4q9n5" Apr 16 14:00:19.159434 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:19.159393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 14:00:19.159955 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:19.159566 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:00:19.159955 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:19.159590 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c49c4f59-mqzlz: secret "image-registry-tls" not found Apr 16 
14:00:19.159955 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:19.159664 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls podName:8ec9d172-ec00-458f-b2d6-85404e6b97bf nodeName:}" failed. No retries permitted until 2026-04-16 14:00:51.159643219 +0000 UTC m=+97.789011480 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls") pod "image-registry-54c49c4f59-mqzlz" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf") : secret "image-registry-tls" not found
Apr 16 14:00:19.260556 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:19.260523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm"
Apr 16 14:00:19.260725 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:19.260589 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr"
Apr 16 14:00:19.260725 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:19.260666 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:19.260725 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:19.260700 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:19.260824 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:19.260730 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert podName:aa74ab6f-55fb-4757-9677-130c7dc8c62c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:51.26071457 +0000 UTC m=+97.890082826 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert") pod "ingress-canary-wjbcm" (UID: "aa74ab6f-55fb-4757-9677-130c7dc8c62c") : secret "canary-serving-cert" not found
Apr 16 14:00:19.260824 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:19.260749 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls podName:eaab260d-b8fe-47b6-8446-b16d19857d43 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:51.260736783 +0000 UTC m=+97.890105043 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls") pod "dns-default-v72fr" (UID: "eaab260d-b8fe-47b6-8446-b16d19857d43") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:19.663122 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:19.663085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599"
Apr 16 14:00:19.665962 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:19.665942 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:00:19.673539 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:19.673514 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:00:19.673598 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:19.673588 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs podName:97f73dc3-4dcf-4643-8dc6-cd6e6418679b nodeName:}" failed. No retries permitted until 2026-04-16 14:01:23.673571908 +0000 UTC m=+130.302940164 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs") pod "network-metrics-daemon-gg599" (UID: "97f73dc3-4dcf-4643-8dc6-cd6e6418679b") : secret "metrics-daemon-secret" not found
Apr 16 14:00:19.763586 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:19.763546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 14:00:19.766318 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:19.766300 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:00:19.776462 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:19.776441 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:00:19.786858 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:19.786826 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsdpt\" (UniqueName: \"kubernetes.io/projected/a45bf770-bb2a-4a8f-8fa8-60cb36789e8c-kube-api-access-vsdpt\") pod \"network-check-target-ptrgm\" (UID: \"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c\") " pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 14:00:20.071877 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:20.071804 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m8ds6\""
Apr 16 14:00:20.080107 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:20.080088 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 14:00:20.193512 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:20.193484 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ptrgm"]
Apr 16 14:00:20.196756 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:00:20.196724 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda45bf770_bb2a_4a8f_8fa8_60cb36789e8c.slice/crio-4a8ff947d6db4a604b1875c8ab6fb25e033ac0618d1db0e98b2046b00345ad2f WatchSource:0}: Error finding container 4a8ff947d6db4a604b1875c8ab6fb25e033ac0618d1db0e98b2046b00345ad2f: Status 404 returned error can't find the container with id 4a8ff947d6db4a604b1875c8ab6fb25e033ac0618d1db0e98b2046b00345ad2f
Apr 16 14:00:20.212475 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:20.212445 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ptrgm" event={"ID":"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c","Type":"ContainerStarted","Data":"4a8ff947d6db4a604b1875c8ab6fb25e033ac0618d1db0e98b2046b00345ad2f"}
Apr 16 14:00:23.220353 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:23.220320 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ptrgm" event={"ID":"a45bf770-bb2a-4a8f-8fa8-60cb36789e8c","Type":"ContainerStarted","Data":"cfca1d3af3a4973dc0e9761a83b8749247541569fa090fed612b342597216cb2"}
Apr 16 14:00:23.220746 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:23.220461 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 14:00:23.237338 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:23.237293 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ptrgm" podStartSLOduration=66.34518507 podStartE2EDuration="1m9.237281102s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 14:00:20.198797733 +0000 UTC m=+66.828165995" lastFinishedPulling="2026-04-16 14:00:23.090893756 +0000 UTC m=+69.720262027" observedRunningTime="2026-04-16 14:00:23.236736762 +0000 UTC m=+69.866105042" watchObservedRunningTime="2026-04-16 14:00:23.237281102 +0000 UTC m=+69.866649381"
Apr 16 14:00:51.202140 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:51.202096 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 14:00:51.202578 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:51.202219 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:00:51.202578 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:51.202231 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54c49c4f59-mqzlz: secret "image-registry-tls" not found
Apr 16 14:00:51.202578 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:51.202289 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls podName:8ec9d172-ec00-458f-b2d6-85404e6b97bf nodeName:}" failed. No retries permitted until 2026-04-16 14:01:55.202276795 +0000 UTC m=+161.831645051 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls") pod "image-registry-54c49c4f59-mqzlz" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf") : secret "image-registry-tls" not found
Apr 16 14:00:51.302462 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:51.302426 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr"
Apr 16 14:00:51.302628 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:51.302484 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm"
Apr 16 14:00:51.302628 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:51.302599 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:51.302699 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:51.302666 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls podName:eaab260d-b8fe-47b6-8446-b16d19857d43 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:55.302651696 +0000 UTC m=+161.932019958 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls") pod "dns-default-v72fr" (UID: "eaab260d-b8fe-47b6-8446-b16d19857d43") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:51.302699 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:51.302599 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:51.302766 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:00:51.302706 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert podName:aa74ab6f-55fb-4757-9677-130c7dc8c62c nodeName:}" failed. No retries permitted until 2026-04-16 14:01:55.302696913 +0000 UTC m=+161.932065192 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert") pod "ingress-canary-wjbcm" (UID: "aa74ab6f-55fb-4757-9677-130c7dc8c62c") : secret "canary-serving-cert" not found
Apr 16 14:00:54.225529 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:00:54.225497 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ptrgm"
Apr 16 14:01:23.746672 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:23.746641 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599"
Apr 16 14:01:23.747120 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:01:23.746785 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:01:23.747120 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:01:23.746849 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs podName:97f73dc3-4dcf-4643-8dc6-cd6e6418679b nodeName:}" failed. No retries permitted until 2026-04-16 14:03:25.746834308 +0000 UTC m=+252.376202565 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs") pod "network-metrics-daemon-gg599" (UID: "97f73dc3-4dcf-4643-8dc6-cd6e6418679b") : secret "metrics-daemon-secret" not found
Apr 16 14:01:23.912759 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:23.912733 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l47f6_bb45c2a8-3222-492e-a359-cd27a52d6faa/dns-node-resolver/0.log"
Apr 16 14:01:24.913366 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:24.913339 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h95cv_444c36ba-0722-4b97-88e0-a10913a4f6b4/node-ca/0.log"
Apr 16 14:01:27.624001 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:27.623942 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" podUID="a79fd531-9d97-4643-9f14-87092aca16e5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:01:37.624774 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:37.624736 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" podUID="a79fd531-9d97-4643-9f14-87092aca16e5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:01:45.221995 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.221954 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cw5lz"]
Apr 16 14:01:45.223995 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.223980 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.231347 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.231327 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 14:01:45.232493 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.232476 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:01:45.232577 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.232562 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 14:01:45.238808 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.238790 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:01:45.239531 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.239514 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fhg6t\""
Apr 16 14:01:45.259802 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.259768 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cw5lz"]
Apr 16 14:01:45.315020 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.314990 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/992a5177-6c6b-4d11-8f10-1218ec3dbd79-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.315199 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.315047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48srh\" (UniqueName: \"kubernetes.io/projected/992a5177-6c6b-4d11-8f10-1218ec3dbd79-kube-api-access-48srh\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.315199 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.315101 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/992a5177-6c6b-4d11-8f10-1218ec3dbd79-data-volume\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.315199 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.315146 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/992a5177-6c6b-4d11-8f10-1218ec3dbd79-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.315199 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.315181 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/992a5177-6c6b-4d11-8f10-1218ec3dbd79-crio-socket\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.415772 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.415737 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/992a5177-6c6b-4d11-8f10-1218ec3dbd79-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.415948 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.415801 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48srh\" (UniqueName: \"kubernetes.io/projected/992a5177-6c6b-4d11-8f10-1218ec3dbd79-kube-api-access-48srh\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.415948 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.415831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/992a5177-6c6b-4d11-8f10-1218ec3dbd79-data-volume\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.415948 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.415851 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/992a5177-6c6b-4d11-8f10-1218ec3dbd79-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.415948 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.415891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/992a5177-6c6b-4d11-8f10-1218ec3dbd79-crio-socket\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.416119 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.415976 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/992a5177-6c6b-4d11-8f10-1218ec3dbd79-crio-socket\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.416269 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.416249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/992a5177-6c6b-4d11-8f10-1218ec3dbd79-data-volume\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.416472 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.416454 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/992a5177-6c6b-4d11-8f10-1218ec3dbd79-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.418156 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.418137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/992a5177-6c6b-4d11-8f10-1218ec3dbd79-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.432506 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.432481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48srh\" (UniqueName: \"kubernetes.io/projected/992a5177-6c6b-4d11-8f10-1218ec3dbd79-kube-api-access-48srh\") pod \"insights-runtime-extractor-cw5lz\" (UID: \"992a5177-6c6b-4d11-8f10-1218ec3dbd79\") " pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.531978 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.531892 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cw5lz"
Apr 16 14:01:45.657662 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:45.657622 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cw5lz"]
Apr 16 14:01:45.661732 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:01:45.661706 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod992a5177_6c6b_4d11_8f10_1218ec3dbd79.slice/crio-ec65de9e137d03492252dcc71eeb3281300eefae209a2be3883a69e5bc1489a2 WatchSource:0}: Error finding container ec65de9e137d03492252dcc71eeb3281300eefae209a2be3883a69e5bc1489a2: Status 404 returned error can't find the container with id ec65de9e137d03492252dcc71eeb3281300eefae209a2be3883a69e5bc1489a2
Apr 16 14:01:46.424942 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:46.424905 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cw5lz" event={"ID":"992a5177-6c6b-4d11-8f10-1218ec3dbd79","Type":"ContainerStarted","Data":"dfeb19250d2b46932b3ca98b0623d525a2c33d329dc4a0bd86eeaf92bed16107"}
Apr 16 14:01:46.424942 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:46.424944 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cw5lz" event={"ID":"992a5177-6c6b-4d11-8f10-1218ec3dbd79","Type":"ContainerStarted","Data":"ec65de9e137d03492252dcc71eeb3281300eefae209a2be3883a69e5bc1489a2"}
Apr 16 14:01:47.428846 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:47.428814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cw5lz" event={"ID":"992a5177-6c6b-4d11-8f10-1218ec3dbd79","Type":"ContainerStarted","Data":"1d818ba212495a7919586b41ca36f1e3580da6c537ab9d500fcf018f7bf16cf3"}
Apr 16 14:01:47.624532 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:47.624492 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" podUID="a79fd531-9d97-4643-9f14-87092aca16e5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:01:47.624680 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:47.624566 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd"
Apr 16 14:01:47.625040 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:47.625012 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"fd3f9f8f7b4cd9c5b9505108f2fdc4eab8840a947dcfdf1cb39f83fbcf98cd5c"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 14:01:47.625112 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:47.625097 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" podUID="a79fd531-9d97-4643-9f14-87092aca16e5" containerName="service-proxy" containerID="cri-o://fd3f9f8f7b4cd9c5b9505108f2fdc4eab8840a947dcfdf1cb39f83fbcf98cd5c" gracePeriod=30
Apr 16 14:01:48.433784 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:48.433745 2571 generic.go:358] "Generic (PLEG): container finished" podID="a79fd531-9d97-4643-9f14-87092aca16e5" containerID="fd3f9f8f7b4cd9c5b9505108f2fdc4eab8840a947dcfdf1cb39f83fbcf98cd5c" exitCode=2
Apr 16 14:01:48.434251 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:48.433813 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" event={"ID":"a79fd531-9d97-4643-9f14-87092aca16e5","Type":"ContainerDied","Data":"fd3f9f8f7b4cd9c5b9505108f2fdc4eab8840a947dcfdf1cb39f83fbcf98cd5c"}
Apr 16 14:01:48.434251 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:48.433860 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5847b58c87-sgngd" event={"ID":"a79fd531-9d97-4643-9f14-87092aca16e5","Type":"ContainerStarted","Data":"4e21d9c70108370fcc6a5ffc197af48c96606b290133183accbfbef28e5d34cb"}
Apr 16 14:01:49.437958 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:49.437925 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cw5lz" event={"ID":"992a5177-6c6b-4d11-8f10-1218ec3dbd79","Type":"ContainerStarted","Data":"8ab0b7a72492ba521aed13a606f363a699f3e2971c4c09bd7653a8746d9f69c5"}
Apr 16 14:01:49.458644 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:49.458596 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cw5lz" podStartSLOduration=1.651365637 podStartE2EDuration="4.458581241s" podCreationTimestamp="2026-04-16 14:01:45 +0000 UTC" firstStartedPulling="2026-04-16 14:01:45.710643883 +0000 UTC m=+152.340012140" lastFinishedPulling="2026-04-16 14:01:48.517859486 +0000 UTC m=+155.147227744" observedRunningTime="2026-04-16 14:01:49.457643991 +0000 UTC m=+156.087012270" watchObservedRunningTime="2026-04-16 14:01:49.458581241 +0000 UTC m=+156.087949520"
Apr 16 14:01:50.312727 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:01:50.312632 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" podUID="8ec9d172-ec00-458f-b2d6-85404e6b97bf"
Apr 16 14:01:50.361020 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:01:50.360977 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wjbcm" podUID="aa74ab6f-55fb-4757-9677-130c7dc8c62c"
Apr 16 14:01:50.398502 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:01:50.398465 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-v72fr" podUID="eaab260d-b8fe-47b6-8446-b16d19857d43"
Apr 16 14:01:50.440381 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:50.440351 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz"
Apr 16 14:01:50.962683 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:01:50.962625 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gg599" podUID="97f73dc3-4dcf-4643-8dc6-cd6e6418679b"
Apr 16 14:01:52.916151 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:52.916119 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mzqsp"]
Apr 16 14:01:52.918478 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:52.918455 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:52.923339 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:52.923312 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:01:52.923487 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:52.923391 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qj4mz\""
Apr 16 14:01:52.923487 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:52.923425 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:01:52.923645 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:52.923630 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:01:52.923701 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:52.923668 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:01:52.924699 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:52.924410 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:01:52.924699 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:52.924420 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:01:53.080916 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.080886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-sys\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.081143 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.080937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-textfile\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.081143 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.080984 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-metrics-client-ca\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.081143 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.081024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-accelerators-collector-config\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.081143 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.081097 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-tls\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.081324 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.081167 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.081324 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.081237 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnrc\" (UniqueName: \"kubernetes.io/projected/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-kube-api-access-8mnrc\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.081324 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.081295 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-root\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.081437 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.081342 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-wtmp\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.182543 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182449 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-metrics-client-ca\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.182543 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-accelerators-collector-config\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.182740 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182553 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-tls\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.182740 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.182740 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnrc\" (UniqueName: \"kubernetes.io/projected/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-kube-api-access-8mnrc\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.182740 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-root\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.182740 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-wtmp\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.183045 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-sys\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.183045 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182768 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-root\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.183045 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-textfile\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp"
Apr 16 14:01:53.183045 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-wtmp\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") "
pod="openshift-monitoring/node-exporter-mzqsp" Apr 16 14:01:53.183045 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.182956 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-sys\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp" Apr 16 14:01:53.183275 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.183111 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-metrics-client-ca\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp" Apr 16 14:01:53.183275 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.183127 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-textfile\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp" Apr 16 14:01:53.183275 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.183178 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-accelerators-collector-config\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp" Apr 16 14:01:53.184815 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.184795 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp" Apr 16 14:01:53.184960 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.184945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-node-exporter-tls\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp" Apr 16 14:01:53.190814 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.190795 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnrc\" (UniqueName: \"kubernetes.io/projected/0a21b3db-dd0e-44d3-8703-4ba945e6e96c-kube-api-access-8mnrc\") pod \"node-exporter-mzqsp\" (UID: \"0a21b3db-dd0e-44d3-8703-4ba945e6e96c\") " pod="openshift-monitoring/node-exporter-mzqsp" Apr 16 14:01:53.226919 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.226890 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-mzqsp" Apr 16 14:01:53.235057 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:01:53.235027 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a21b3db_dd0e_44d3_8703_4ba945e6e96c.slice/crio-4839d283f6b45c4d3a7e7a3838c15f891c35436937278b1a374dcb949754c920 WatchSource:0}: Error finding container 4839d283f6b45c4d3a7e7a3838c15f891c35436937278b1a374dcb949754c920: Status 404 returned error can't find the container with id 4839d283f6b45c4d3a7e7a3838c15f891c35436937278b1a374dcb949754c920 Apr 16 14:01:53.449107 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:53.449007 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mzqsp" event={"ID":"0a21b3db-dd0e-44d3-8703-4ba945e6e96c","Type":"ContainerStarted","Data":"4839d283f6b45c4d3a7e7a3838c15f891c35436937278b1a374dcb949754c920"} Apr 16 14:01:54.453708 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:54.453678 2571 generic.go:358] "Generic (PLEG): container finished" podID="77410159-0fa9-45da-acb5-37356380ab25" containerID="bfaf7aa7ae8a754d5dafc0b93fa0fe2088a7d5d2d61c97bf3f95f9ced15da42e" exitCode=1 Apr 16 14:01:54.454048 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:54.453744 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" event={"ID":"77410159-0fa9-45da-acb5-37356380ab25","Type":"ContainerDied","Data":"bfaf7aa7ae8a754d5dafc0b93fa0fe2088a7d5d2d61c97bf3f95f9ced15da42e"} Apr 16 14:01:54.454149 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:54.454132 2571 scope.go:117] "RemoveContainer" containerID="bfaf7aa7ae8a754d5dafc0b93fa0fe2088a7d5d2d61c97bf3f95f9ced15da42e" Apr 16 14:01:54.455106 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:54.455084 2571 generic.go:358] "Generic (PLEG): container finished" podID="490046cd-2f06-4014-b0d3-7662ed1b4f8e" 
containerID="4a8eda1b5abb7e378b4331fa741277ab689375b82a7ca76137f706eed31accde" exitCode=255 Apr 16 14:01:54.455106 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:54.455096 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn" event={"ID":"490046cd-2f06-4014-b0d3-7662ed1b4f8e","Type":"ContainerDied","Data":"4a8eda1b5abb7e378b4331fa741277ab689375b82a7ca76137f706eed31accde"} Apr 16 14:01:54.455402 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:54.455387 2571 scope.go:117] "RemoveContainer" containerID="4a8eda1b5abb7e378b4331fa741277ab689375b82a7ca76137f706eed31accde" Apr 16 14:01:54.456667 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:54.456554 2571 generic.go:358] "Generic (PLEG): container finished" podID="0a21b3db-dd0e-44d3-8703-4ba945e6e96c" containerID="52a6be0409e2417cb21b020fda2356335f5f8584389e2c3d9fa88d33f2c2149b" exitCode=0 Apr 16 14:01:54.456667 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:54.456598 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mzqsp" event={"ID":"0a21b3db-dd0e-44d3-8703-4ba945e6e96c","Type":"ContainerDied","Data":"52a6be0409e2417cb21b020fda2356335f5f8584389e2c3d9fa88d33f2c2149b"} Apr 16 14:01:55.149293 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.149237 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 14:01:55.299609 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.299569 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 14:01:55.301964 ip-10-0-131-99 kubenswrapper[2571]: 
I0416 14:01:55.301939 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"image-registry-54c49c4f59-mqzlz\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 14:01:55.400025 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.399930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 14:01:55.400025 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.399994 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 14:01:55.402307 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.402276 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa74ab6f-55fb-4757-9677-130c7dc8c62c-cert\") pod \"ingress-canary-wjbcm\" (UID: \"aa74ab6f-55fb-4757-9677-130c7dc8c62c\") " pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 14:01:55.402418 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.402342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eaab260d-b8fe-47b6-8446-b16d19857d43-metrics-tls\") pod \"dns-default-v72fr\" (UID: \"eaab260d-b8fe-47b6-8446-b16d19857d43\") " pod="openshift-dns/dns-default-v72fr" Apr 16 14:01:55.461089 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.461034 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" event={"ID":"77410159-0fa9-45da-acb5-37356380ab25","Type":"ContainerStarted","Data":"79cc4006ad48bfd60967756046c436e4a8ac9d36886fb6d0745eb706e63dd586"} Apr 16 14:01:55.461534 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.461306 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 14:01:55.461985 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.461965 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ccfc5dbb9-hkqhm" Apr 16 14:01:55.462684 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.462664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68f559f9c9-d5mjn" event={"ID":"490046cd-2f06-4014-b0d3-7662ed1b4f8e","Type":"ContainerStarted","Data":"fce90c9144e5c8e978072304962f936c152457d6420829ca62566564e3b51f6f"} Apr 16 14:01:55.464262 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.464237 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mzqsp" event={"ID":"0a21b3db-dd0e-44d3-8703-4ba945e6e96c","Type":"ContainerStarted","Data":"981148a0a3a6ff4b75870ff3800a91ace856246f639089bca88ddacbfc94bca6"} Apr 16 14:01:55.464364 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.464268 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mzqsp" event={"ID":"0a21b3db-dd0e-44d3-8703-4ba945e6e96c","Type":"ContainerStarted","Data":"d92d54124e213c1b34b36dbe2f1ddec50180a517caf47cdeec78c841032a4047"} Apr 16 14:01:55.500432 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.500384 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mzqsp" 
podStartSLOduration=2.570419169 podStartE2EDuration="3.500371345s" podCreationTimestamp="2026-04-16 14:01:52 +0000 UTC" firstStartedPulling="2026-04-16 14:01:53.236949096 +0000 UTC m=+159.866317354" lastFinishedPulling="2026-04-16 14:01:54.166901255 +0000 UTC m=+160.796269530" observedRunningTime="2026-04-16 14:01:55.49989169 +0000 UTC m=+162.129259970" watchObservedRunningTime="2026-04-16 14:01:55.500371345 +0000 UTC m=+162.129739623" Apr 16 14:01:55.543552 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.543526 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jfbd2\"" Apr 16 14:01:55.551740 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.551714 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 14:01:55.671088 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:55.670996 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54c49c4f59-mqzlz"] Apr 16 14:01:55.673955 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:01:55.673921 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ec9d172_ec00_458f_b2d6_85404e6b97bf.slice/crio-9bf04df28b6f8279060e1d193b8902e8e2d8e1b09f1f49156b612c876d827a09 WatchSource:0}: Error finding container 9bf04df28b6f8279060e1d193b8902e8e2d8e1b09f1f49156b612c876d827a09: Status 404 returned error can't find the container with id 9bf04df28b6f8279060e1d193b8902e8e2d8e1b09f1f49156b612c876d827a09 Apr 16 14:01:56.467845 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:56.467805 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" event={"ID":"8ec9d172-ec00-458f-b2d6-85404e6b97bf","Type":"ContainerStarted","Data":"8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae"} Apr 16 
14:01:56.467845 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:56.467847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" event={"ID":"8ec9d172-ec00-458f-b2d6-85404e6b97bf","Type":"ContainerStarted","Data":"9bf04df28b6f8279060e1d193b8902e8e2d8e1b09f1f49156b612c876d827a09"} Apr 16 14:01:57.470951 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:01:57.470914 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 14:02:03.952098 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:03.951997 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 14:02:03.954959 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:03.954938 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-r2bfh\"" Apr 16 14:02:03.963279 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:03.963264 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wjbcm" Apr 16 14:02:04.075553 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:04.075506 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" podStartSLOduration=170.07548804 podStartE2EDuration="2m50.07548804s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:56.488988247 +0000 UTC m=+163.118356548" watchObservedRunningTime="2026-04-16 14:02:04.07548804 +0000 UTC m=+170.704856319" Apr 16 14:02:04.076225 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:04.076207 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wjbcm"] Apr 16 14:02:04.079302 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:02:04.079274 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa74ab6f_55fb_4757_9677_130c7dc8c62c.slice/crio-1a1884a76db4dea10493c93f90f004b9afe1e42a7a56cac62e278f304a77663d WatchSource:0}: Error finding container 1a1884a76db4dea10493c93f90f004b9afe1e42a7a56cac62e278f304a77663d: Status 404 returned error can't find the container with id 1a1884a76db4dea10493c93f90f004b9afe1e42a7a56cac62e278f304a77663d Apr 16 14:02:04.489217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:04.489187 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wjbcm" event={"ID":"aa74ab6f-55fb-4757-9677-130c7dc8c62c","Type":"ContainerStarted","Data":"1a1884a76db4dea10493c93f90f004b9afe1e42a7a56cac62e278f304a77663d"} Apr 16 14:02:05.955002 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:05.954976 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v72fr" Apr 16 14:02:05.955431 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:05.954976 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 14:02:05.958228 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:05.958202 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5ngv\"" Apr 16 14:02:05.965806 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:05.965785 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v72fr" Apr 16 14:02:06.172278 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:06.172242 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v72fr"] Apr 16 14:02:06.177104 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:02:06.177056 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaab260d_b8fe_47b6_8446_b16d19857d43.slice/crio-4c53861de60c147e7309748ca0b79d3011b3512031d257189df0a9bd90197ce2 WatchSource:0}: Error finding container 4c53861de60c147e7309748ca0b79d3011b3512031d257189df0a9bd90197ce2: Status 404 returned error can't find the container with id 4c53861de60c147e7309748ca0b79d3011b3512031d257189df0a9bd90197ce2 Apr 16 14:02:06.495986 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:06.495942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wjbcm" event={"ID":"aa74ab6f-55fb-4757-9677-130c7dc8c62c","Type":"ContainerStarted","Data":"ed343f7b3e96a974ef7e87f13e9ee8bb645a0ea045d930e35cd58f7c683c1cbd"} Apr 16 14:02:06.496922 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:06.496903 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v72fr" 
event={"ID":"eaab260d-b8fe-47b6-8446-b16d19857d43","Type":"ContainerStarted","Data":"4c53861de60c147e7309748ca0b79d3011b3512031d257189df0a9bd90197ce2"} Apr 16 14:02:06.512629 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:06.512565 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wjbcm" podStartSLOduration=137.512519604 podStartE2EDuration="2m19.512550883s" podCreationTimestamp="2026-04-16 13:59:47 +0000 UTC" firstStartedPulling="2026-04-16 14:02:04.081210168 +0000 UTC m=+170.710578439" lastFinishedPulling="2026-04-16 14:02:06.081241446 +0000 UTC m=+172.710609718" observedRunningTime="2026-04-16 14:02:06.511713743 +0000 UTC m=+173.141082044" watchObservedRunningTime="2026-04-16 14:02:06.512550883 +0000 UTC m=+173.141919162" Apr 16 14:02:08.504198 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:08.504162 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v72fr" event={"ID":"eaab260d-b8fe-47b6-8446-b16d19857d43","Type":"ContainerStarted","Data":"e51c3ec896d46886711366ea4c999eb6b72d934dbe97ff5ec0fb53dce77fb9b2"} Apr 16 14:02:08.504198 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:08.504199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v72fr" event={"ID":"eaab260d-b8fe-47b6-8446-b16d19857d43","Type":"ContainerStarted","Data":"b643abd77707b83a9395e146194ee654b83c4ca9eae3a680fc414fc2c0b9df54"} Apr 16 14:02:08.504733 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:08.504353 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-v72fr" Apr 16 14:02:08.524942 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:08.524898 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-v72fr" podStartSLOduration=140.257875487 podStartE2EDuration="2m21.524885589s" podCreationTimestamp="2026-04-16 13:59:47 +0000 UTC" 
firstStartedPulling="2026-04-16 14:02:06.179275761 +0000 UTC m=+172.808644018" lastFinishedPulling="2026-04-16 14:02:07.44628585 +0000 UTC m=+174.075654120" observedRunningTime="2026-04-16 14:02:08.523723146 +0000 UTC m=+175.153091427" watchObservedRunningTime="2026-04-16 14:02:08.524885589 +0000 UTC m=+175.154253868" Apr 16 14:02:15.556463 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:15.556419 2571 patch_prober.go:28] interesting pod/image-registry-54c49c4f59-mqzlz container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:02:15.556836 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:15.556475 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" podUID="8ec9d172-ec00-458f-b2d6-85404e6b97bf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:18.477816 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:18.477783 2571 patch_prober.go:28] interesting pod/image-registry-54c49c4f59-mqzlz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:02:18.478195 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:18.477835 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" podUID="8ec9d172-ec00-458f-b2d6-85404e6b97bf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:18.509349 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:18.509324 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-dns/dns-default-v72fr" Apr 16 14:02:25.557177 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:25.557137 2571 patch_prober.go:28] interesting pod/image-registry-54c49c4f59-mqzlz container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:02:25.557702 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:25.557205 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" podUID="8ec9d172-ec00-458f-b2d6-85404e6b97bf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:28.477158 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:28.477127 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 14:02:35.684761 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:02:35.684724 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54c49c4f59-mqzlz"] Apr 16 14:03:00.702954 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:00.702885 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" podUID="8ec9d172-ec00-458f-b2d6-85404e6b97bf" containerName="registry" containerID="cri-o://8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae" gracePeriod=30 Apr 16 14:03:01.942736 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:01.942714 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 14:03:02.104700 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.104604 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqh2c\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-kube-api-access-mqh2c\") pod \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " Apr 16 14:03:02.104700 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.104652 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-trusted-ca\") pod \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " Apr 16 14:03:02.104700 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.104685 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ec9d172-ec00-458f-b2d6-85404e6b97bf-ca-trust-extracted\") pod \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " Apr 16 14:03:02.105025 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.104719 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-image-registry-private-configuration\") pod \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " Apr 16 14:03:02.105025 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.104735 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-certificates\") pod \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\" (UID: 
\"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " Apr 16 14:03:02.105025 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.104760 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") pod \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " Apr 16 14:03:02.105025 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.104793 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-installation-pull-secrets\") pod \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " Apr 16 14:03:02.105025 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.104817 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-bound-sa-token\") pod \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\" (UID: \"8ec9d172-ec00-458f-b2d6-85404e6b97bf\") " Apr 16 14:03:02.105309 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.105232 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8ec9d172-ec00-458f-b2d6-85404e6b97bf" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:02.105309 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.105244 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8ec9d172-ec00-458f-b2d6-85404e6b97bf" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:02.107298 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.107261 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8ec9d172-ec00-458f-b2d6-85404e6b97bf" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:03:02.107427 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.107369 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-kube-api-access-mqh2c" (OuterVolumeSpecName: "kube-api-access-mqh2c") pod "8ec9d172-ec00-458f-b2d6-85404e6b97bf" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf"). InnerVolumeSpecName "kube-api-access-mqh2c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:03:02.107427 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.107376 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8ec9d172-ec00-458f-b2d6-85404e6b97bf" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:02.107513 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.107422 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8ec9d172-ec00-458f-b2d6-85404e6b97bf" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:02.107513 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.107473 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8ec9d172-ec00-458f-b2d6-85404e6b97bf" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:03:02.121974 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.121947 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec9d172-ec00-458f-b2d6-85404e6b97bf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8ec9d172-ec00-458f-b2d6-85404e6b97bf" (UID: "8ec9d172-ec00-458f-b2d6-85404e6b97bf"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:03:02.206039 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.205990 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-tls\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.206039 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.206034 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-installation-pull-secrets\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.206235 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.206062 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-bound-sa-token\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.206235 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.206090 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqh2c\" (UniqueName: \"kubernetes.io/projected/8ec9d172-ec00-458f-b2d6-85404e6b97bf-kube-api-access-mqh2c\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.206235 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.206099 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-trusted-ca\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.206235 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.206108 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ec9d172-ec00-458f-b2d6-85404e6b97bf-ca-trust-extracted\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.206235 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:03:02.206118 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8ec9d172-ec00-458f-b2d6-85404e6b97bf-image-registry-private-configuration\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.206235 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.206128 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ec9d172-ec00-458f-b2d6-85404e6b97bf-registry-certificates\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.641245 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.641209 2571 generic.go:358] "Generic (PLEG): container finished" podID="8ec9d172-ec00-458f-b2d6-85404e6b97bf" containerID="8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae" exitCode=0 Apr 16 14:03:02.641405 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.641265 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" Apr 16 14:03:02.641405 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.641297 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" event={"ID":"8ec9d172-ec00-458f-b2d6-85404e6b97bf","Type":"ContainerDied","Data":"8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae"} Apr 16 14:03:02.641405 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.641332 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c49c4f59-mqzlz" event={"ID":"8ec9d172-ec00-458f-b2d6-85404e6b97bf","Type":"ContainerDied","Data":"9bf04df28b6f8279060e1d193b8902e8e2d8e1b09f1f49156b612c876d827a09"} Apr 16 14:03:02.641405 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.641346 2571 scope.go:117] "RemoveContainer" containerID="8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae" Apr 16 14:03:02.649732 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.649717 2571 scope.go:117] "RemoveContainer" containerID="8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae" Apr 16 14:03:02.649973 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:03:02.649954 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae\": container with ID starting with 8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae not found: ID does not exist" containerID="8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae" Apr 16 14:03:02.650034 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.649980 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae"} err="failed to get container status 
\"8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae\": rpc error: code = NotFound desc = could not find container \"8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae\": container with ID starting with 8c608d340e5068564f9664ac7bda7f178029ba5265797f7e3978d9d5278cd7ae not found: ID does not exist" Apr 16 14:03:02.663421 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.663396 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54c49c4f59-mqzlz"] Apr 16 14:03:02.667935 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:02.667912 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-54c49c4f59-mqzlz"] Apr 16 14:03:03.954858 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:03.954826 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec9d172-ec00-458f-b2d6-85404e6b97bf" path="/var/lib/kubelet/pods/8ec9d172-ec00-458f-b2d6-85404e6b97bf/volumes" Apr 16 14:03:25.772481 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:25.772430 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 14:03:25.774836 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:25.774804 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97f73dc3-4dcf-4643-8dc6-cd6e6418679b-metrics-certs\") pod \"network-metrics-daemon-gg599\" (UID: \"97f73dc3-4dcf-4643-8dc6-cd6e6418679b\") " pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 14:03:26.059456 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:26.059376 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2qtwk\"" Apr 16 14:03:26.066583 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:26.066563 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gg599" Apr 16 14:03:26.210914 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:26.210865 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gg599"] Apr 16 14:03:26.214792 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:03:26.214764 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f73dc3_4dcf_4643_8dc6_cd6e6418679b.slice/crio-755fa3804160357397c3e9cca9d4d90f5f935502ead2387e4fd89c301393dd73 WatchSource:0}: Error finding container 755fa3804160357397c3e9cca9d4d90f5f935502ead2387e4fd89c301393dd73: Status 404 returned error can't find the container with id 755fa3804160357397c3e9cca9d4d90f5f935502ead2387e4fd89c301393dd73 Apr 16 14:03:26.702398 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:26.702358 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gg599" event={"ID":"97f73dc3-4dcf-4643-8dc6-cd6e6418679b","Type":"ContainerStarted","Data":"755fa3804160357397c3e9cca9d4d90f5f935502ead2387e4fd89c301393dd73"} Apr 16 14:03:27.709290 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:27.709259 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gg599" event={"ID":"97f73dc3-4dcf-4643-8dc6-cd6e6418679b","Type":"ContainerStarted","Data":"dfa3f95853bef971703392055836417279eb7c25bc10f909a7e9e2ea73c1d2a1"} Apr 16 14:03:27.709658 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:27.709296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gg599" 
event={"ID":"97f73dc3-4dcf-4643-8dc6-cd6e6418679b","Type":"ContainerStarted","Data":"c2d699a6b781a5b93537b0b2eb8b951b8e04a563602a0921c85fdc9a5b584ad4"} Apr 16 14:03:27.727647 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:03:27.727600 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gg599" podStartSLOduration=252.741329076 podStartE2EDuration="4m13.727585348s" podCreationTimestamp="2026-04-16 13:59:14 +0000 UTC" firstStartedPulling="2026-04-16 14:03:26.216512495 +0000 UTC m=+252.845880752" lastFinishedPulling="2026-04-16 14:03:27.202768766 +0000 UTC m=+253.832137024" observedRunningTime="2026-04-16 14:03:27.725643794 +0000 UTC m=+254.355012073" watchObservedRunningTime="2026-04-16 14:03:27.727585348 +0000 UTC m=+254.356953673" Apr 16 14:04:13.858532 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:04:13.858505 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 14:04:13.860709 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:04:13.860686 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 14:04:13.867448 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:04:13.867427 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:07:00.252940 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.252905 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-5njvw"] Apr 16 14:07:00.253378 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.253133 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ec9d172-ec00-458f-b2d6-85404e6b97bf" containerName="registry" Apr 16 14:07:00.253378 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.253144 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8ec9d172-ec00-458f-b2d6-85404e6b97bf" containerName="registry" Apr 16 14:07:00.253378 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.253189 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ec9d172-ec00-458f-b2d6-85404e6b97bf" containerName="registry" Apr 16 14:07:00.255882 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.255866 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:00.258864 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.258839 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 14:07:00.258994 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.258895 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 14:07:00.258994 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.258921 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-4gp8h\"" Apr 16 14:07:00.266297 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.266274 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-5njvw"] Apr 16 14:07:00.345409 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.345373 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgrxw\" (UniqueName: \"kubernetes.io/projected/c41c29c4-1653-42ee-a908-c3b1295c3e8f-kube-api-access-lgrxw\") pod \"cert-manager-webhook-597b96b99b-5njvw\" (UID: \"c41c29c4-1653-42ee-a908-c3b1295c3e8f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:00.345409 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.345409 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/c41c29c4-1653-42ee-a908-c3b1295c3e8f-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-5njvw\" (UID: \"c41c29c4-1653-42ee-a908-c3b1295c3e8f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:00.446322 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.446284 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgrxw\" (UniqueName: \"kubernetes.io/projected/c41c29c4-1653-42ee-a908-c3b1295c3e8f-kube-api-access-lgrxw\") pod \"cert-manager-webhook-597b96b99b-5njvw\" (UID: \"c41c29c4-1653-42ee-a908-c3b1295c3e8f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:00.446322 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.446325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c41c29c4-1653-42ee-a908-c3b1295c3e8f-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-5njvw\" (UID: \"c41c29c4-1653-42ee-a908-c3b1295c3e8f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:00.456573 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.456545 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgrxw\" (UniqueName: \"kubernetes.io/projected/c41c29c4-1653-42ee-a908-c3b1295c3e8f-kube-api-access-lgrxw\") pod \"cert-manager-webhook-597b96b99b-5njvw\" (UID: \"c41c29c4-1653-42ee-a908-c3b1295c3e8f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:00.457899 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.457882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c41c29c4-1653-42ee-a908-c3b1295c3e8f-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-5njvw\" (UID: \"c41c29c4-1653-42ee-a908-c3b1295c3e8f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:00.564708 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:07:00.564604 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:00.684654 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.684620 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-5njvw"] Apr 16 14:07:00.688383 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:07:00.688355 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc41c29c4_1653_42ee_a908_c3b1295c3e8f.slice/crio-eb67173a3394abcbe0a8f02b5c070fa6cd36fa7194516546f4d24bba658fed13 WatchSource:0}: Error finding container eb67173a3394abcbe0a8f02b5c070fa6cd36fa7194516546f4d24bba658fed13: Status 404 returned error can't find the container with id eb67173a3394abcbe0a8f02b5c070fa6cd36fa7194516546f4d24bba658fed13 Apr 16 14:07:00.690178 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:00.690160 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:07:01.251578 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:01.251544 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" event={"ID":"c41c29c4-1653-42ee-a908-c3b1295c3e8f","Type":"ContainerStarted","Data":"eb67173a3394abcbe0a8f02b5c070fa6cd36fa7194516546f4d24bba658fed13"} Apr 16 14:07:05.265318 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:05.265284 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" event={"ID":"c41c29c4-1653-42ee-a908-c3b1295c3e8f","Type":"ContainerStarted","Data":"5d8c48c01114550798c992c4b9385d720d36a97f683e3424090685e40ceba9d4"} Apr 16 14:07:05.265681 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:05.265340 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:05.287927 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:05.287884 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" podStartSLOduration=1.720212989 podStartE2EDuration="5.28787243s" podCreationTimestamp="2026-04-16 14:07:00 +0000 UTC" firstStartedPulling="2026-04-16 14:07:00.690321027 +0000 UTC m=+467.319689287" lastFinishedPulling="2026-04-16 14:07:04.257980468 +0000 UTC m=+470.887348728" observedRunningTime="2026-04-16 14:07:05.286832454 +0000 UTC m=+471.916200734" watchObservedRunningTime="2026-04-16 14:07:05.28787243 +0000 UTC m=+471.917240706" Apr 16 14:07:11.269392 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:11.269362 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-5njvw" Apr 16 14:07:14.291880 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.291848 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc"] Apr 16 14:07:14.295641 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.295618 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" Apr 16 14:07:14.298603 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.298586 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 14:07:14.299609 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.299594 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:07:14.299674 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.299616 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-bq87k\"" Apr 16 14:07:14.303722 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.303699 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc"] Apr 16 14:07:14.454961 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.454921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt9dr\" (UniqueName: \"kubernetes.io/projected/9bbcefce-670d-4d32-a732-52ef39512af3-kube-api-access-gt9dr\") pod \"openshift-lws-operator-bfc7f696d-x6qvc\" (UID: \"9bbcefce-670d-4d32-a732-52ef39512af3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" Apr 16 14:07:14.455180 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.454986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9bbcefce-670d-4d32-a732-52ef39512af3-tmp\") pod \"openshift-lws-operator-bfc7f696d-x6qvc\" (UID: \"9bbcefce-670d-4d32-a732-52ef39512af3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" Apr 16 14:07:14.556343 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.556254 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gt9dr\" (UniqueName: \"kubernetes.io/projected/9bbcefce-670d-4d32-a732-52ef39512af3-kube-api-access-gt9dr\") pod \"openshift-lws-operator-bfc7f696d-x6qvc\" (UID: \"9bbcefce-670d-4d32-a732-52ef39512af3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" Apr 16 14:07:14.556343 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.556309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9bbcefce-670d-4d32-a732-52ef39512af3-tmp\") pod \"openshift-lws-operator-bfc7f696d-x6qvc\" (UID: \"9bbcefce-670d-4d32-a732-52ef39512af3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" Apr 16 14:07:14.556664 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.556646 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9bbcefce-670d-4d32-a732-52ef39512af3-tmp\") pod \"openshift-lws-operator-bfc7f696d-x6qvc\" (UID: \"9bbcefce-670d-4d32-a732-52ef39512af3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" Apr 16 14:07:14.575418 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.575387 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt9dr\" (UniqueName: \"kubernetes.io/projected/9bbcefce-670d-4d32-a732-52ef39512af3-kube-api-access-gt9dr\") pod \"openshift-lws-operator-bfc7f696d-x6qvc\" (UID: \"9bbcefce-670d-4d32-a732-52ef39512af3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" Apr 16 14:07:14.605229 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.605191 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" Apr 16 14:07:14.728781 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:14.728732 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc"] Apr 16 14:07:14.731891 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:07:14.731858 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bbcefce_670d_4d32_a732_52ef39512af3.slice/crio-fe6627b566450266108440741b8643c080a6cf671fd7ee2038ed15719e03817f WatchSource:0}: Error finding container fe6627b566450266108440741b8643c080a6cf671fd7ee2038ed15719e03817f: Status 404 returned error can't find the container with id fe6627b566450266108440741b8643c080a6cf671fd7ee2038ed15719e03817f Apr 16 14:07:15.291622 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:15.291582 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" event={"ID":"9bbcefce-670d-4d32-a732-52ef39512af3","Type":"ContainerStarted","Data":"fe6627b566450266108440741b8643c080a6cf671fd7ee2038ed15719e03817f"} Apr 16 14:07:17.299189 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:17.299138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" event={"ID":"9bbcefce-670d-4d32-a732-52ef39512af3","Type":"ContainerStarted","Data":"14bed768b981f4641775890d47aa87c8c4ac45e57f99b3a5cb1bd83438d02275"} Apr 16 14:07:17.316126 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:17.316057 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-x6qvc" podStartSLOduration=0.910015424 podStartE2EDuration="3.316043469s" podCreationTimestamp="2026-04-16 14:07:14 +0000 UTC" firstStartedPulling="2026-04-16 14:07:14.733307289 +0000 UTC m=+481.362675547" 
lastFinishedPulling="2026-04-16 14:07:17.139335331 +0000 UTC m=+483.768703592" observedRunningTime="2026-04-16 14:07:17.315138848 +0000 UTC m=+483.944507127" watchObservedRunningTime="2026-04-16 14:07:17.316043469 +0000 UTC m=+483.945411747" Apr 16 14:07:19.230344 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.230310 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-mmh2j"] Apr 16 14:07:19.233404 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.233387 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-mmh2j" Apr 16 14:07:19.235965 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.235928 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-l49s7\"" Apr 16 14:07:19.240751 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.240730 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-mmh2j"] Apr 16 14:07:19.391127 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.391090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77hf6\" (UniqueName: \"kubernetes.io/projected/92f539a6-02e6-44b9-b4ff-c36382cfac36-kube-api-access-77hf6\") pod \"cert-manager-759f64656b-mmh2j\" (UID: \"92f539a6-02e6-44b9-b4ff-c36382cfac36\") " pod="cert-manager/cert-manager-759f64656b-mmh2j" Apr 16 14:07:19.391314 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.391156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f539a6-02e6-44b9-b4ff-c36382cfac36-bound-sa-token\") pod \"cert-manager-759f64656b-mmh2j\" (UID: \"92f539a6-02e6-44b9-b4ff-c36382cfac36\") " pod="cert-manager/cert-manager-759f64656b-mmh2j" Apr 16 14:07:19.491902 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.491814 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77hf6\" (UniqueName: \"kubernetes.io/projected/92f539a6-02e6-44b9-b4ff-c36382cfac36-kube-api-access-77hf6\") pod \"cert-manager-759f64656b-mmh2j\" (UID: \"92f539a6-02e6-44b9-b4ff-c36382cfac36\") " pod="cert-manager/cert-manager-759f64656b-mmh2j" Apr 16 14:07:19.491902 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.491877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f539a6-02e6-44b9-b4ff-c36382cfac36-bound-sa-token\") pod \"cert-manager-759f64656b-mmh2j\" (UID: \"92f539a6-02e6-44b9-b4ff-c36382cfac36\") " pod="cert-manager/cert-manager-759f64656b-mmh2j" Apr 16 14:07:19.499947 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.499916 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f539a6-02e6-44b9-b4ff-c36382cfac36-bound-sa-token\") pod \"cert-manager-759f64656b-mmh2j\" (UID: \"92f539a6-02e6-44b9-b4ff-c36382cfac36\") " pod="cert-manager/cert-manager-759f64656b-mmh2j" Apr 16 14:07:19.500316 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.500298 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77hf6\" (UniqueName: \"kubernetes.io/projected/92f539a6-02e6-44b9-b4ff-c36382cfac36-kube-api-access-77hf6\") pod \"cert-manager-759f64656b-mmh2j\" (UID: \"92f539a6-02e6-44b9-b4ff-c36382cfac36\") " pod="cert-manager/cert-manager-759f64656b-mmh2j" Apr 16 14:07:19.542301 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.542264 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-mmh2j" Apr 16 14:07:19.659529 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:19.659494 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-mmh2j"] Apr 16 14:07:19.662604 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:07:19.662577 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92f539a6_02e6_44b9_b4ff_c36382cfac36.slice/crio-dcc9a41907df89417e860c76b4e8884a56efda7fdca7d63e5ad72f6d831dd5bc WatchSource:0}: Error finding container dcc9a41907df89417e860c76b4e8884a56efda7fdca7d63e5ad72f6d831dd5bc: Status 404 returned error can't find the container with id dcc9a41907df89417e860c76b4e8884a56efda7fdca7d63e5ad72f6d831dd5bc Apr 16 14:07:20.307021 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:20.306986 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-mmh2j" event={"ID":"92f539a6-02e6-44b9-b4ff-c36382cfac36","Type":"ContainerStarted","Data":"fb28b8794f1a391763e68e38a6f2b9f4e7bfc40ffad9bcc48958ca33ab801001"} Apr 16 14:07:20.307021 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:20.307021 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-mmh2j" event={"ID":"92f539a6-02e6-44b9-b4ff-c36382cfac36","Type":"ContainerStarted","Data":"dcc9a41907df89417e860c76b4e8884a56efda7fdca7d63e5ad72f6d831dd5bc"} Apr 16 14:07:20.326187 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:20.326141 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-mmh2j" podStartSLOduration=1.326126995 podStartE2EDuration="1.326126995s" podCreationTimestamp="2026-04-16 14:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:07:20.325285402 +0000 UTC m=+486.954653682" 
watchObservedRunningTime="2026-04-16 14:07:20.326126995 +0000 UTC m=+486.955495273" Apr 16 14:07:46.861398 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.861362 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-4rttq"] Apr 16 14:07:46.868513 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.868490 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:07:46.872529 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.872497 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 14:07:46.873055 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.873034 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-6992q\"" Apr 16 14:07:46.873681 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.873660 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 14:07:46.884648 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.884608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/447f5c24-b308-44c6-8d56-51a16d29e134-operator-config\") pod \"servicemesh-operator3-55f49c5f94-4rttq\" (UID: \"447f5c24-b308-44c6-8d56-51a16d29e134\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:07:46.884799 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.884685 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz8dc\" (UniqueName: \"kubernetes.io/projected/447f5c24-b308-44c6-8d56-51a16d29e134-kube-api-access-jz8dc\") pod \"servicemesh-operator3-55f49c5f94-4rttq\" (UID: 
\"447f5c24-b308-44c6-8d56-51a16d29e134\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:07:46.892732 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.892704 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-4rttq"] Apr 16 14:07:46.985550 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.985509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jz8dc\" (UniqueName: \"kubernetes.io/projected/447f5c24-b308-44c6-8d56-51a16d29e134-kube-api-access-jz8dc\") pod \"servicemesh-operator3-55f49c5f94-4rttq\" (UID: \"447f5c24-b308-44c6-8d56-51a16d29e134\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:07:46.985761 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.985578 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/447f5c24-b308-44c6-8d56-51a16d29e134-operator-config\") pod \"servicemesh-operator3-55f49c5f94-4rttq\" (UID: \"447f5c24-b308-44c6-8d56-51a16d29e134\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:07:46.988286 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.988259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/447f5c24-b308-44c6-8d56-51a16d29e134-operator-config\") pod \"servicemesh-operator3-55f49c5f94-4rttq\" (UID: \"447f5c24-b308-44c6-8d56-51a16d29e134\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:07:46.996914 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:46.996877 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz8dc\" (UniqueName: \"kubernetes.io/projected/447f5c24-b308-44c6-8d56-51a16d29e134-kube-api-access-jz8dc\") pod \"servicemesh-operator3-55f49c5f94-4rttq\" (UID: 
\"447f5c24-b308-44c6-8d56-51a16d29e134\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:07:47.178313 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:47.178271 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:07:47.302823 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:47.302800 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-4rttq"] Apr 16 14:07:47.305145 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:07:47.305113 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod447f5c24_b308_44c6_8d56_51a16d29e134.slice/crio-e47a8e141abb711e9f7a7202ffb9af0362acc663172ba09941fda66d30a56ad0 WatchSource:0}: Error finding container e47a8e141abb711e9f7a7202ffb9af0362acc663172ba09941fda66d30a56ad0: Status 404 returned error can't find the container with id e47a8e141abb711e9f7a7202ffb9af0362acc663172ba09941fda66d30a56ad0 Apr 16 14:07:47.381041 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:47.381012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" event={"ID":"447f5c24-b308-44c6-8d56-51a16d29e134","Type":"ContainerStarted","Data":"e47a8e141abb711e9f7a7202ffb9af0362acc663172ba09941fda66d30a56ad0"} Apr 16 14:07:52.399344 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:52.399302 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" event={"ID":"447f5c24-b308-44c6-8d56-51a16d29e134","Type":"ContainerStarted","Data":"37c73af613dedfb37bd0f715fb9024c2147ae5ae827b1d56bcbb32b8d68cfd37"} Apr 16 14:07:52.399795 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:52.399457 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:07:52.428093 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:52.428001 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" podStartSLOduration=2.337398081 podStartE2EDuration="6.42798607s" podCreationTimestamp="2026-04-16 14:07:46 +0000 UTC" firstStartedPulling="2026-04-16 14:07:47.307683583 +0000 UTC m=+513.937051844" lastFinishedPulling="2026-04-16 14:07:51.398271576 +0000 UTC m=+518.027639833" observedRunningTime="2026-04-16 14:07:52.426464307 +0000 UTC m=+519.055832585" watchObservedRunningTime="2026-04-16 14:07:52.42798607 +0000 UTC m=+519.057354349" Apr 16 14:07:54.772379 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.772322 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s"] Apr 16 14:07:54.775608 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.775588 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.778519 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.778499 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 14:07:54.778641 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.778502 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 14:07:54.778963 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.778943 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 14:07:54.778963 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.778958 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 14:07:54.779160 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.779054 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 14:07:54.779580 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.779561 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 14:07:54.779666 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.779562 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-rbjsf\"" Apr 16 14:07:54.788766 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.788746 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s"] Apr 16 14:07:54.840897 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.840869 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvttd\" (UniqueName: 
\"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-kube-api-access-zvttd\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.840897 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.840903 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.841116 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.840929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.841116 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.840960 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.841116 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.840984 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2c395-a40e-46f0-b904-f9799fee34c3-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: 
\"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.841116 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.841003 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.841116 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.841024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.941392 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.941354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.941392 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.941392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2c395-a40e-46f0-b904-f9799fee34c3-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.941607 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.941410 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.941607 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.941444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.941607 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.941465 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvttd\" (UniqueName: \"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-kube-api-access-zvttd\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.941607 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.941485 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.941607 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.941524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-token\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.942326 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.942298 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.943844 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.943822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2c395-a40e-46f0-b904-f9799fee34c3-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.943925 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.943856 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.944247 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.944222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.944345 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.944246 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.954180 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.954152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvttd\" (UniqueName: \"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-kube-api-access-zvttd\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:54.954414 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:54.954375 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-sls9s\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:55.085555 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:55.085462 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:55.216478 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:55.216445 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s"] Apr 16 14:07:55.219304 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:07:55.219271 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6e2c395_a40e_46f0_b904_f9799fee34c3.slice/crio-74cd32e8153376245a3c95649181d6b4da00d430c67715b7a4abd5cf56dd20e4 WatchSource:0}: Error finding container 74cd32e8153376245a3c95649181d6b4da00d430c67715b7a4abd5cf56dd20e4: Status 404 returned error can't find the container with id 74cd32e8153376245a3c95649181d6b4da00d430c67715b7a4abd5cf56dd20e4 Apr 16 14:07:55.410703 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:55.410663 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" event={"ID":"b6e2c395-a40e-46f0-b904-f9799fee34c3","Type":"ContainerStarted","Data":"74cd32e8153376245a3c95649181d6b4da00d430c67715b7a4abd5cf56dd20e4"} Apr 16 14:07:58.284364 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:58.284324 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 14:07:58.284647 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:58.284393 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 14:07:58.420757 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:58.420724 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" 
event={"ID":"b6e2c395-a40e-46f0-b904-f9799fee34c3","Type":"ContainerStarted","Data":"ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671"} Apr 16 14:07:58.420915 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:58.420864 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:07:58.443085 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:58.443024 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" podStartSLOduration=1.380106506 podStartE2EDuration="4.44300856s" podCreationTimestamp="2026-04-16 14:07:54 +0000 UTC" firstStartedPulling="2026-04-16 14:07:55.221207384 +0000 UTC m=+521.850575641" lastFinishedPulling="2026-04-16 14:07:58.284109437 +0000 UTC m=+524.913477695" observedRunningTime="2026-04-16 14:07:58.441534554 +0000 UTC m=+525.070902846" watchObservedRunningTime="2026-04-16 14:07:58.44300856 +0000 UTC m=+525.072376839" Apr 16 14:07:59.426311 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:07:59.426281 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" Apr 16 14:08:03.404084 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:03.403982 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-4rttq" Apr 16 14:08:24.457447 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.457407 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv"] Apr 16 14:08:24.460521 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.460504 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" Apr 16 14:08:24.466223 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.466198 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-f8zzn\"" Apr 16 14:08:24.466959 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.466938 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 14:08:24.467257 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.467243 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 14:08:24.484480 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.484452 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv"] Apr 16 14:08:24.575058 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.575023 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/17b6a0f3-a787-402a-b143-10829b107975-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f7hwv\" (UID: \"17b6a0f3-a787-402a-b143-10829b107975\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" Apr 16 14:08:24.575255 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.575091 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5g89\" (UniqueName: \"kubernetes.io/projected/17b6a0f3-a787-402a-b143-10829b107975-kube-api-access-q5g89\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f7hwv\" (UID: \"17b6a0f3-a787-402a-b143-10829b107975\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" Apr 16 14:08:24.676294 
ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.676252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/17b6a0f3-a787-402a-b143-10829b107975-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f7hwv\" (UID: \"17b6a0f3-a787-402a-b143-10829b107975\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" Apr 16 14:08:24.676499 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.676298 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5g89\" (UniqueName: \"kubernetes.io/projected/17b6a0f3-a787-402a-b143-10829b107975-kube-api-access-q5g89\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f7hwv\" (UID: \"17b6a0f3-a787-402a-b143-10829b107975\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" Apr 16 14:08:24.676722 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.676700 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/17b6a0f3-a787-402a-b143-10829b107975-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f7hwv\" (UID: \"17b6a0f3-a787-402a-b143-10829b107975\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" Apr 16 14:08:24.685045 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.685021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5g89\" (UniqueName: \"kubernetes.io/projected/17b6a0f3-a787-402a-b143-10829b107975-kube-api-access-q5g89\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f7hwv\" (UID: \"17b6a0f3-a787-402a-b143-10829b107975\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" Apr 16 14:08:24.770323 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.770229 2571 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" Apr 16 14:08:24.896498 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:24.896465 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv"] Apr 16 14:08:24.899947 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:08:24.899919 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17b6a0f3_a787_402a_b143_10829b107975.slice/crio-1c655aef5b555a5fc70eae66c84e58e14005d440674469566b20b3cbb19aa94d WatchSource:0}: Error finding container 1c655aef5b555a5fc70eae66c84e58e14005d440674469566b20b3cbb19aa94d: Status 404 returned error can't find the container with id 1c655aef5b555a5fc70eae66c84e58e14005d440674469566b20b3cbb19aa94d Apr 16 14:08:25.506267 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:25.506233 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" event={"ID":"17b6a0f3-a787-402a-b143-10829b107975","Type":"ContainerStarted","Data":"1c655aef5b555a5fc70eae66c84e58e14005d440674469566b20b3cbb19aa94d"} Apr 16 14:08:26.414999 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:26.414886 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln"] Apr 16 14:08:26.420161 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:26.418436 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln" Apr 16 14:08:26.421461 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:26.421435 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-vs9gx\"" Apr 16 14:08:26.425766 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:26.425740 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln"] Apr 16 14:08:26.491090 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:26.490985 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmz88\" (UniqueName: \"kubernetes.io/projected/dc0bbac0-13c6-4a05-9912-955257610f3d-kube-api-access-zmz88\") pod \"limitador-operator-controller-manager-c7fb4c8d5-b5dln\" (UID: \"dc0bbac0-13c6-4a05-9912-955257610f3d\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln" Apr 16 14:08:26.592088 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:26.592041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmz88\" (UniqueName: \"kubernetes.io/projected/dc0bbac0-13c6-4a05-9912-955257610f3d-kube-api-access-zmz88\") pod \"limitador-operator-controller-manager-c7fb4c8d5-b5dln\" (UID: \"dc0bbac0-13c6-4a05-9912-955257610f3d\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln" Apr 16 14:08:26.607054 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:26.607013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmz88\" (UniqueName: \"kubernetes.io/projected/dc0bbac0-13c6-4a05-9912-955257610f3d-kube-api-access-zmz88\") pod \"limitador-operator-controller-manager-c7fb4c8d5-b5dln\" (UID: \"dc0bbac0-13c6-4a05-9912-955257610f3d\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln" Apr 
16 14:08:26.732922 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:26.732799 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln" Apr 16 14:08:26.897466 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:26.897400 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln"] Apr 16 14:08:26.901841 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:08:26.901787 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc0bbac0_13c6_4a05_9912_955257610f3d.slice/crio-6c8106bbbcb929c8d76eed468c5fa80239c318452865ce7ebeaf29b4b85cfcee WatchSource:0}: Error finding container 6c8106bbbcb929c8d76eed468c5fa80239c318452865ce7ebeaf29b4b85cfcee: Status 404 returned error can't find the container with id 6c8106bbbcb929c8d76eed468c5fa80239c318452865ce7ebeaf29b4b85cfcee Apr 16 14:08:27.514581 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:27.514541 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln" event={"ID":"dc0bbac0-13c6-4a05-9912-955257610f3d","Type":"ContainerStarted","Data":"6c8106bbbcb929c8d76eed468c5fa80239c318452865ce7ebeaf29b4b85cfcee"} Apr 16 14:08:31.529689 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:31.529647 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" event={"ID":"17b6a0f3-a787-402a-b143-10829b107975","Type":"ContainerStarted","Data":"7c0beec5e36d09bfedb12ca71dbd81829a04d291b6b7b9d6834b7eaad517f8af"} Apr 16 14:08:31.530151 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:31.529765 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" Apr 16 14:08:31.530977 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:08:31.530957 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln" event={"ID":"dc0bbac0-13c6-4a05-9912-955257610f3d","Type":"ContainerStarted","Data":"7c00b4246b66f33085bf49913a6b7fa006058adbcec2629a5c8598190b24c04b"}
Apr 16 14:08:31.531100 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:31.531042 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln"
Apr 16 14:08:31.553253 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:31.553126 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv" podStartSLOduration=1.509625743 podStartE2EDuration="7.553110599s" podCreationTimestamp="2026-04-16 14:08:24 +0000 UTC" firstStartedPulling="2026-04-16 14:08:24.90279416 +0000 UTC m=+551.532162416" lastFinishedPulling="2026-04-16 14:08:30.946279008 +0000 UTC m=+557.575647272" observedRunningTime="2026-04-16 14:08:31.551383939 +0000 UTC m=+558.180752220" watchObservedRunningTime="2026-04-16 14:08:31.553110599 +0000 UTC m=+558.182478876"
Apr 16 14:08:31.568203 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:31.568156 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln" podStartSLOduration=1.516972933 podStartE2EDuration="5.568142604s" podCreationTimestamp="2026-04-16 14:08:26 +0000 UTC" firstStartedPulling="2026-04-16 14:08:26.904462411 +0000 UTC m=+553.533830668" lastFinishedPulling="2026-04-16 14:08:30.955632064 +0000 UTC m=+557.585000339" observedRunningTime="2026-04-16 14:08:31.567283957 +0000 UTC m=+558.196652236" watchObservedRunningTime="2026-04-16 14:08:31.568142604 +0000 UTC m=+558.197510882"
Apr 16 14:08:42.536518 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:42.536485 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f7hwv"
Apr 16 14:08:42.536897 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:08:42.536548 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-b5dln"
Apr 16 14:09:13.881471 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:13.881442 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log"
Apr 16 14:09:13.881940 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:13.881696 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log"
Apr 16 14:09:14.128825 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.128788 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"]
Apr 16 14:09:14.132617 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.132557 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:14.135159 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.135136 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5hx64\""
Apr 16 14:09:14.135374 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.135358 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 14:09:14.141159 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.141138 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"]
Apr 16 14:09:14.149961 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.149932 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b8668a1-1704-42a3-ad56-0bee07e101b8-config-file\") pod \"limitador-limitador-64c8f475fb-2xkbw\" (UID: \"2b8668a1-1704-42a3-ad56-0bee07e101b8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:14.150248 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.150202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plbkm\" (UniqueName: \"kubernetes.io/projected/2b8668a1-1704-42a3-ad56-0bee07e101b8-kube-api-access-plbkm\") pod \"limitador-limitador-64c8f475fb-2xkbw\" (UID: \"2b8668a1-1704-42a3-ad56-0bee07e101b8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:14.230596 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.230564 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"]
Apr 16 14:09:14.251086 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.251037 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b8668a1-1704-42a3-ad56-0bee07e101b8-config-file\") pod \"limitador-limitador-64c8f475fb-2xkbw\" (UID: \"2b8668a1-1704-42a3-ad56-0bee07e101b8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:14.251234 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.251099 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plbkm\" (UniqueName: \"kubernetes.io/projected/2b8668a1-1704-42a3-ad56-0bee07e101b8-kube-api-access-plbkm\") pod \"limitador-limitador-64c8f475fb-2xkbw\" (UID: \"2b8668a1-1704-42a3-ad56-0bee07e101b8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:14.251631 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.251610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b8668a1-1704-42a3-ad56-0bee07e101b8-config-file\") pod \"limitador-limitador-64c8f475fb-2xkbw\" (UID: \"2b8668a1-1704-42a3-ad56-0bee07e101b8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:14.263525 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.263492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plbkm\" (UniqueName: \"kubernetes.io/projected/2b8668a1-1704-42a3-ad56-0bee07e101b8-kube-api-access-plbkm\") pod \"limitador-limitador-64c8f475fb-2xkbw\" (UID: \"2b8668a1-1704-42a3-ad56-0bee07e101b8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:14.443757 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.443666 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:14.567536 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.567501 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"]
Apr 16 14:09:14.571537 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:09:14.571507 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8668a1_1704_42a3_ad56_0bee07e101b8.slice/crio-8a82b197360e8855d803ea44583e3337a87265fd4fa80eef64ca9a0f109852e5 WatchSource:0}: Error finding container 8a82b197360e8855d803ea44583e3337a87265fd4fa80eef64ca9a0f109852e5: Status 404 returned error can't find the container with id 8a82b197360e8855d803ea44583e3337a87265fd4fa80eef64ca9a0f109852e5
Apr 16 14:09:14.665146 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:14.665107 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw" event={"ID":"2b8668a1-1704-42a3-ad56-0bee07e101b8","Type":"ContainerStarted","Data":"8a82b197360e8855d803ea44583e3337a87265fd4fa80eef64ca9a0f109852e5"}
Apr 16 14:09:18.680387 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:18.680353 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw" event={"ID":"2b8668a1-1704-42a3-ad56-0bee07e101b8","Type":"ContainerStarted","Data":"3365ca8ec9befcbc2cb23c35c8ce9adbb01e1e550dc4a7f745c021fe6f2dfb96"}
Apr 16 14:09:18.680873 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:18.680484 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:29.684853 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:29.684821 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:29.700789 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:29.700728 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw" podStartSLOduration=11.752640574 podStartE2EDuration="15.700710367s" podCreationTimestamp="2026-04-16 14:09:14 +0000 UTC" firstStartedPulling="2026-04-16 14:09:14.573759562 +0000 UTC m=+601.203127833" lastFinishedPulling="2026-04-16 14:09:18.521829366 +0000 UTC m=+605.151197626" observedRunningTime="2026-04-16 14:09:18.698419677 +0000 UTC m=+605.327787957" watchObservedRunningTime="2026-04-16 14:09:29.700710367 +0000 UTC m=+616.330078647"
Apr 16 14:09:30.222025 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:30.221985 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"]
Apr 16 14:09:30.222242 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:30.222219 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw" podUID="2b8668a1-1704-42a3-ad56-0bee07e101b8" containerName="limitador" containerID="cri-o://3365ca8ec9befcbc2cb23c35c8ce9adbb01e1e550dc4a7f745c021fe6f2dfb96" gracePeriod=30
Apr 16 14:09:30.717487 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:30.717452 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b8668a1-1704-42a3-ad56-0bee07e101b8" containerID="3365ca8ec9befcbc2cb23c35c8ce9adbb01e1e550dc4a7f745c021fe6f2dfb96" exitCode=0
Apr 16 14:09:30.717487 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:30.717490 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw" event={"ID":"2b8668a1-1704-42a3-ad56-0bee07e101b8","Type":"ContainerDied","Data":"3365ca8ec9befcbc2cb23c35c8ce9adbb01e1e550dc4a7f745c021fe6f2dfb96"}
Apr 16 14:09:31.160031 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.160008 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:31.182668 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.182628 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plbkm\" (UniqueName: \"kubernetes.io/projected/2b8668a1-1704-42a3-ad56-0bee07e101b8-kube-api-access-plbkm\") pod \"2b8668a1-1704-42a3-ad56-0bee07e101b8\" (UID: \"2b8668a1-1704-42a3-ad56-0bee07e101b8\") "
Apr 16 14:09:31.182819 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.182712 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b8668a1-1704-42a3-ad56-0bee07e101b8-config-file\") pod \"2b8668a1-1704-42a3-ad56-0bee07e101b8\" (UID: \"2b8668a1-1704-42a3-ad56-0bee07e101b8\") "
Apr 16 14:09:31.183091 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.183048 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8668a1-1704-42a3-ad56-0bee07e101b8-config-file" (OuterVolumeSpecName: "config-file") pod "2b8668a1-1704-42a3-ad56-0bee07e101b8" (UID: "2b8668a1-1704-42a3-ad56-0bee07e101b8"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:09:31.184921 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.184888 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8668a1-1704-42a3-ad56-0bee07e101b8-kube-api-access-plbkm" (OuterVolumeSpecName: "kube-api-access-plbkm") pod "2b8668a1-1704-42a3-ad56-0bee07e101b8" (UID: "2b8668a1-1704-42a3-ad56-0bee07e101b8"). InnerVolumeSpecName "kube-api-access-plbkm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:09:31.283374 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.283335 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-plbkm\" (UniqueName: \"kubernetes.io/projected/2b8668a1-1704-42a3-ad56-0bee07e101b8-kube-api-access-plbkm\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:09:31.283374 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.283369 2571 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b8668a1-1704-42a3-ad56-0bee07e101b8-config-file\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:09:31.721522 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.721486 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw" event={"ID":"2b8668a1-1704-42a3-ad56-0bee07e101b8","Type":"ContainerDied","Data":"8a82b197360e8855d803ea44583e3337a87265fd4fa80eef64ca9a0f109852e5"}
Apr 16 14:09:31.721522 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.721503 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"
Apr 16 14:09:31.722050 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.721536 2571 scope.go:117] "RemoveContainer" containerID="3365ca8ec9befcbc2cb23c35c8ce9adbb01e1e550dc4a7f745c021fe6f2dfb96"
Apr 16 14:09:31.742227 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.742197 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"]
Apr 16 14:09:31.746032 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.746005 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-2xkbw"]
Apr 16 14:09:31.955480 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:31.955447 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8668a1-1704-42a3-ad56-0bee07e101b8" path="/var/lib/kubelet/pods/2b8668a1-1704-42a3-ad56-0bee07e101b8/volumes"
Apr 16 14:09:49.495964 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.495925 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"]
Apr 16 14:09:49.496484 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.496314 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b8668a1-1704-42a3-ad56-0bee07e101b8" containerName="limitador"
Apr 16 14:09:49.496484 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.496332 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8668a1-1704-42a3-ad56-0bee07e101b8" containerName="limitador"
Apr 16 14:09:49.496484 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.496401 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b8668a1-1704-42a3-ad56-0bee07e101b8" containerName="limitador"
Apr 16 14:09:49.499053 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.499031 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.511208 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.511181 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"]
Apr 16 14:09:49.630319 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.630286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/cf7faa71-231f-4467-a789-acd6da492013-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.630503 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.630327 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cf7faa71-231f-4467-a789-acd6da492013-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.630503 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.630351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cf7faa71-231f-4467-a789-acd6da492013-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.630503 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.630374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/cf7faa71-231f-4467-a789-acd6da492013-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.630503 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.630454 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq89g\" (UniqueName: \"kubernetes.io/projected/cf7faa71-231f-4467-a789-acd6da492013-kube-api-access-zq89g\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.630503 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.630501 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/cf7faa71-231f-4467-a789-acd6da492013-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.630686 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.630586 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/cf7faa71-231f-4467-a789-acd6da492013-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.731560 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.731526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/cf7faa71-231f-4467-a789-acd6da492013-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.731753 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.731573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/cf7faa71-231f-4467-a789-acd6da492013-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.731753 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.731607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cf7faa71-231f-4467-a789-acd6da492013-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.731753 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.731712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cf7faa71-231f-4467-a789-acd6da492013-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.731753 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.731749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/cf7faa71-231f-4467-a789-acd6da492013-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.731970 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.731776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq89g\" (UniqueName: \"kubernetes.io/projected/cf7faa71-231f-4467-a789-acd6da492013-kube-api-access-zq89g\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.731970 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.731807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/cf7faa71-231f-4467-a789-acd6da492013-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.732484 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.732457 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/cf7faa71-231f-4467-a789-acd6da492013-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.734085 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.734051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/cf7faa71-231f-4467-a789-acd6da492013-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.734174 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.734111 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/cf7faa71-231f-4467-a789-acd6da492013-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.734216 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.734176 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cf7faa71-231f-4467-a789-acd6da492013-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.734216 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.734211 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/cf7faa71-231f-4467-a789-acd6da492013-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.740782 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.740760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cf7faa71-231f-4467-a789-acd6da492013-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.742266 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.742248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq89g\" (UniqueName: \"kubernetes.io/projected/cf7faa71-231f-4467-a789-acd6da492013-kube-api-access-zq89g\") pod \"istiod-openshift-gateway-55ff986f96-mdhsz\" (UID: \"cf7faa71-231f-4467-a789-acd6da492013\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.809065 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.808964 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:49.943526 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.943425 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"]
Apr 16 14:09:49.959892 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:09:49.948095 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf7faa71_231f_4467_a789_acd6da492013.slice/crio-41f378eb0a0003f57e18eeac756a81cacfbf711eb3f7d336bca67dd126b52415 WatchSource:0}: Error finding container 41f378eb0a0003f57e18eeac756a81cacfbf711eb3f7d336bca67dd126b52415: Status 404 returned error can't find the container with id 41f378eb0a0003f57e18eeac756a81cacfbf711eb3f7d336bca67dd126b52415
Apr 16 14:09:49.959892 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.953874 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 14:09:49.959892 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:49.953931 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 14:09:50.782644 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:50.782600 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz" event={"ID":"cf7faa71-231f-4467-a789-acd6da492013","Type":"ContainerStarted","Data":"32f3a87c212afcca20af4bc88e9f95774ef6c6e86ee48dcdf7b0d03b679e6550"}
Apr 16 14:09:50.782644 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:50.782646 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz" event={"ID":"cf7faa71-231f-4467-a789-acd6da492013","Type":"ContainerStarted","Data":"41f378eb0a0003f57e18eeac756a81cacfbf711eb3f7d336bca67dd126b52415"}
Apr 16 14:09:50.783057 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:50.782671 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:50.810298 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:50.810251 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz" podStartSLOduration=1.810234319 podStartE2EDuration="1.810234319s" podCreationTimestamp="2026-04-16 14:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:09:50.808544199 +0000 UTC m=+637.437912478" watchObservedRunningTime="2026-04-16 14:09:50.810234319 +0000 UTC m=+637.439602598"
Apr 16 14:09:51.787862 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:51.787837 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mdhsz"
Apr 16 14:09:51.862577 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:51.862544 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s"]
Apr 16 14:09:51.862859 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:51.862829 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" podUID="b6e2c395-a40e-46f0-b904-f9799fee34c3" containerName="discovery" containerID="cri-o://ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671" gracePeriod=30
Apr 16 14:09:52.108177 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.108154 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s"
Apr 16 14:09:52.151990 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.151940 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-dns-cert\") pod \"b6e2c395-a40e-46f0-b904-f9799fee34c3\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") "
Apr 16 14:09:52.152184 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.152016 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-cacerts\") pod \"b6e2c395-a40e-46f0-b904-f9799fee34c3\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") "
Apr 16 14:09:52.152184 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.152048 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-token\") pod \"b6e2c395-a40e-46f0-b904-f9799fee34c3\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") "
Apr 16 14:09:52.152184 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.152122 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-ca-configmap\") pod \"b6e2c395-a40e-46f0-b904-f9799fee34c3\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") "
Apr 16 14:09:52.152184 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.152164 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvttd\" (UniqueName: \"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-kube-api-access-zvttd\") pod \"b6e2c395-a40e-46f0-b904-f9799fee34c3\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") "
Apr 16 14:09:52.152401 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.152234 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2c395-a40e-46f0-b904-f9799fee34c3-local-certs\") pod \"b6e2c395-a40e-46f0-b904-f9799fee34c3\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") "
Apr 16 14:09:52.152401 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.152277 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-kubeconfig\") pod \"b6e2c395-a40e-46f0-b904-f9799fee34c3\" (UID: \"b6e2c395-a40e-46f0-b904-f9799fee34c3\") "
Apr 16 14:09:52.152735 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.152691 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "b6e2c395-a40e-46f0-b904-f9799fee34c3" (UID: "b6e2c395-a40e-46f0-b904-f9799fee34c3"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:09:52.154753 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.154706 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e2c395-a40e-46f0-b904-f9799fee34c3-local-certs" (OuterVolumeSpecName: "local-certs") pod "b6e2c395-a40e-46f0-b904-f9799fee34c3" (UID: "b6e2c395-a40e-46f0-b904-f9799fee34c3"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:09:52.154753 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.154711 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "b6e2c395-a40e-46f0-b904-f9799fee34c3" (UID: "b6e2c395-a40e-46f0-b904-f9799fee34c3"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:09:52.154924 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.154778 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-token" (OuterVolumeSpecName: "istio-token") pod "b6e2c395-a40e-46f0-b904-f9799fee34c3" (UID: "b6e2c395-a40e-46f0-b904-f9799fee34c3"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:09:52.154924 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.154866 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-cacerts" (OuterVolumeSpecName: "cacerts") pod "b6e2c395-a40e-46f0-b904-f9799fee34c3" (UID: "b6e2c395-a40e-46f0-b904-f9799fee34c3"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:09:52.154924 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.154882 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "b6e2c395-a40e-46f0-b904-f9799fee34c3" (UID: "b6e2c395-a40e-46f0-b904-f9799fee34c3"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:09:52.154924 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.154889 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-kube-api-access-zvttd" (OuterVolumeSpecName: "kube-api-access-zvttd") pod "b6e2c395-a40e-46f0-b904-f9799fee34c3" (UID: "b6e2c395-a40e-46f0-b904-f9799fee34c3"). InnerVolumeSpecName "kube-api-access-zvttd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:09:52.257356 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.257318 2571 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-kubeconfig\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:09:52.257356 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.257359 2571 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-dns-cert\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:09:52.257561 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.257373 2571 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b6e2c395-a40e-46f0-b904-f9799fee34c3-cacerts\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:09:52.257561 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.257383 2571 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-token\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:09:52.257561 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.257393 2571 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b6e2c395-a40e-46f0-b904-f9799fee34c3-istio-csr-ca-configmap\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:09:52.257561 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.257405 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvttd\" (UniqueName: \"kubernetes.io/projected/b6e2c395-a40e-46f0-b904-f9799fee34c3-kube-api-access-zvttd\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:09:52.257561 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.257418 2571 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b6e2c395-a40e-46f0-b904-f9799fee34c3-local-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:09:52.789905 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.789870 2571 generic.go:358] "Generic (PLEG): container finished" podID="b6e2c395-a40e-46f0-b904-f9799fee34c3" containerID="ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671" exitCode=0
Apr 16 14:09:52.790341 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.789955 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s"
Apr 16 14:09:52.790341 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.789958 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" event={"ID":"b6e2c395-a40e-46f0-b904-f9799fee34c3","Type":"ContainerDied","Data":"ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671"}
Apr 16 14:09:52.790341 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.790001 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s" event={"ID":"b6e2c395-a40e-46f0-b904-f9799fee34c3","Type":"ContainerDied","Data":"74cd32e8153376245a3c95649181d6b4da00d430c67715b7a4abd5cf56dd20e4"}
Apr 16 14:09:52.790341 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.790022 2571 scope.go:117] "RemoveContainer" containerID="ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671"
Apr 16 14:09:52.799159 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.799144 2571 scope.go:117] "RemoveContainer" containerID="ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671"
Apr 16 14:09:52.799443 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:09:52.799422 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671\": container with ID starting with ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671 not found: ID does not exist" containerID="ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671"
Apr 16 14:09:52.799522 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.799452 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671"} err="failed to get container status
\"ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671\": rpc error: code = NotFound desc = could not find container \"ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671\": container with ID starting with ce1a8393e95111f30607cbb54d329b78c18498a10793da0285ed65058dc89671 not found: ID does not exist" Apr 16 14:09:52.816343 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.816308 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s"] Apr 16 14:09:52.823252 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:52.823226 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-sls9s"] Apr 16 14:09:53.955867 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:53.955836 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e2c395-a40e-46f0-b904-f9799fee34c3" path="/var/lib/kubelet/pods/b6e2c395-a40e-46f0-b904-f9799fee34c3/volumes" Apr 16 14:09:58.290166 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.290130 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-74pmz"] Apr 16 14:09:58.290519 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.290418 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6e2c395-a40e-46f0-b904-f9799fee34c3" containerName="discovery" Apr 16 14:09:58.290519 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.290430 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e2c395-a40e-46f0-b904-f9799fee34c3" containerName="discovery" Apr 16 14:09:58.290519 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.290488 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6e2c395-a40e-46f0-b904-f9799fee34c3" containerName="discovery" Apr 16 14:09:58.294782 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.294758 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:09:58.297443 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.297413 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 14:09:58.297581 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.297497 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:09:58.297581 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.297422 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:09:58.297830 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.297810 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-vdqvj\"" Apr 16 14:09:58.302878 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.302854 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-74pmz"] Apr 16 14:09:58.305987 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.305964 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-546c567d96-qn5bg"] Apr 16 14:09:58.308948 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.308931 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:09:58.311798 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.311775 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 14:09:58.311959 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.311941 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-m7xd8\"" Apr 16 14:09:58.321943 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.321913 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-546c567d96-qn5bg"] Apr 16 14:09:58.404785 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.404752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/758ef89b-4a19-4796-89f7-a0652dad0b85-cert\") pod \"kserve-controller-manager-9bbf58456-74pmz\" (UID: \"758ef89b-4a19-4796-89f7-a0652dad0b85\") " pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:09:58.404942 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.404793 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c2961ae-97dc-4272-8934-95e91e621b8d-cert\") pod \"llmisvc-controller-manager-546c567d96-qn5bg\" (UID: \"9c2961ae-97dc-4272-8934-95e91e621b8d\") " pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:09:58.404942 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.404828 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r29ww\" (UniqueName: \"kubernetes.io/projected/9c2961ae-97dc-4272-8934-95e91e621b8d-kube-api-access-r29ww\") pod \"llmisvc-controller-manager-546c567d96-qn5bg\" (UID: \"9c2961ae-97dc-4272-8934-95e91e621b8d\") " 
pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:09:58.404942 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.404891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhw9n\" (UniqueName: \"kubernetes.io/projected/758ef89b-4a19-4796-89f7-a0652dad0b85-kube-api-access-vhw9n\") pod \"kserve-controller-manager-9bbf58456-74pmz\" (UID: \"758ef89b-4a19-4796-89f7-a0652dad0b85\") " pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:09:58.506255 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.506215 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/758ef89b-4a19-4796-89f7-a0652dad0b85-cert\") pod \"kserve-controller-manager-9bbf58456-74pmz\" (UID: \"758ef89b-4a19-4796-89f7-a0652dad0b85\") " pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:09:58.506255 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.506251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c2961ae-97dc-4272-8934-95e91e621b8d-cert\") pod \"llmisvc-controller-manager-546c567d96-qn5bg\" (UID: \"9c2961ae-97dc-4272-8934-95e91e621b8d\") " pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:09:58.506504 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.506280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r29ww\" (UniqueName: \"kubernetes.io/projected/9c2961ae-97dc-4272-8934-95e91e621b8d-kube-api-access-r29ww\") pod \"llmisvc-controller-manager-546c567d96-qn5bg\" (UID: \"9c2961ae-97dc-4272-8934-95e91e621b8d\") " pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:09:58.506504 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.506331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhw9n\" (UniqueName: 
\"kubernetes.io/projected/758ef89b-4a19-4796-89f7-a0652dad0b85-kube-api-access-vhw9n\") pod \"kserve-controller-manager-9bbf58456-74pmz\" (UID: \"758ef89b-4a19-4796-89f7-a0652dad0b85\") " pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:09:58.508714 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.508686 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/758ef89b-4a19-4796-89f7-a0652dad0b85-cert\") pod \"kserve-controller-manager-9bbf58456-74pmz\" (UID: \"758ef89b-4a19-4796-89f7-a0652dad0b85\") " pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:09:58.508714 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.508711 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c2961ae-97dc-4272-8934-95e91e621b8d-cert\") pod \"llmisvc-controller-manager-546c567d96-qn5bg\" (UID: \"9c2961ae-97dc-4272-8934-95e91e621b8d\") " pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:09:58.519752 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.519728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r29ww\" (UniqueName: \"kubernetes.io/projected/9c2961ae-97dc-4272-8934-95e91e621b8d-kube-api-access-r29ww\") pod \"llmisvc-controller-manager-546c567d96-qn5bg\" (UID: \"9c2961ae-97dc-4272-8934-95e91e621b8d\") " pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:09:58.530900 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.530872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhw9n\" (UniqueName: \"kubernetes.io/projected/758ef89b-4a19-4796-89f7-a0652dad0b85-kube-api-access-vhw9n\") pod \"kserve-controller-manager-9bbf58456-74pmz\" (UID: \"758ef89b-4a19-4796-89f7-a0652dad0b85\") " pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:09:58.605945 ip-10-0-131-99 kubenswrapper[2571]: I0416 
14:09:58.605844 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:09:58.618856 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.618829 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:09:58.735437 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.735370 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-74pmz"] Apr 16 14:09:58.740718 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:09:58.740689 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod758ef89b_4a19_4796_89f7_a0652dad0b85.slice/crio-e8a7a837f319a4f9693ca3c607e9eacfddb87bcd993e8b60a8058e43f7c583c1 WatchSource:0}: Error finding container e8a7a837f319a4f9693ca3c607e9eacfddb87bcd993e8b60a8058e43f7c583c1: Status 404 returned error can't find the container with id e8a7a837f319a4f9693ca3c607e9eacfddb87bcd993e8b60a8058e43f7c583c1 Apr 16 14:09:58.762931 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.762891 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-546c567d96-qn5bg"] Apr 16 14:09:58.765503 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:09:58.765477 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9c2961ae_97dc_4272_8934_95e91e621b8d.slice/crio-97b33595a032b4f63e9e306ad9d4b46388c9f72233ababda4f497ac6dedb80de WatchSource:0}: Error finding container 97b33595a032b4f63e9e306ad9d4b46388c9f72233ababda4f497ac6dedb80de: Status 404 returned error can't find the container with id 97b33595a032b4f63e9e306ad9d4b46388c9f72233ababda4f497ac6dedb80de Apr 16 14:09:58.812086 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.812032 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" event={"ID":"9c2961ae-97dc-4272-8934-95e91e621b8d","Type":"ContainerStarted","Data":"97b33595a032b4f63e9e306ad9d4b46388c9f72233ababda4f497ac6dedb80de"} Apr 16 14:09:58.813018 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:09:58.812996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" event={"ID":"758ef89b-4a19-4796-89f7-a0652dad0b85","Type":"ContainerStarted","Data":"e8a7a837f319a4f9693ca3c607e9eacfddb87bcd993e8b60a8058e43f7c583c1"} Apr 16 14:10:02.831724 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:02.831688 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" event={"ID":"9c2961ae-97dc-4272-8934-95e91e621b8d","Type":"ContainerStarted","Data":"a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93"} Apr 16 14:10:02.832202 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:02.831783 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:10:02.833096 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:02.833061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" event={"ID":"758ef89b-4a19-4796-89f7-a0652dad0b85","Type":"ContainerStarted","Data":"c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95"} Apr 16 14:10:02.833195 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:02.833153 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:10:02.850553 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:02.850505 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" podStartSLOduration=1.013036558 podStartE2EDuration="4.850490831s" podCreationTimestamp="2026-04-16 14:09:58 +0000 
UTC" firstStartedPulling="2026-04-16 14:09:58.76679185 +0000 UTC m=+645.396160107" lastFinishedPulling="2026-04-16 14:10:02.60424612 +0000 UTC m=+649.233614380" observedRunningTime="2026-04-16 14:10:02.848470377 +0000 UTC m=+649.477838671" watchObservedRunningTime="2026-04-16 14:10:02.850490831 +0000 UTC m=+649.479859100" Apr 16 14:10:02.865486 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:02.865394 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" podStartSLOduration=1.7304690169999999 podStartE2EDuration="4.865380357s" podCreationTimestamp="2026-04-16 14:09:58 +0000 UTC" firstStartedPulling="2026-04-16 14:09:58.74199336 +0000 UTC m=+645.371361616" lastFinishedPulling="2026-04-16 14:10:01.876904691 +0000 UTC m=+648.506272956" observedRunningTime="2026-04-16 14:10:02.864186942 +0000 UTC m=+649.493555258" watchObservedRunningTime="2026-04-16 14:10:02.865380357 +0000 UTC m=+649.494748637" Apr 16 14:10:33.838753 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:33.838716 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" Apr 16 14:10:33.841716 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:33.841694 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:10:35.145732 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.145701 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-74pmz"] Apr 16 14:10:35.146201 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.145954 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" podUID="758ef89b-4a19-4796-89f7-a0652dad0b85" containerName="manager" containerID="cri-o://c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95" gracePeriod=10 Apr 16 14:10:35.167862 
ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.167836 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-xsttp"] Apr 16 14:10:35.169702 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.169688 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:10:35.178659 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.178628 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-xsttp"] Apr 16 14:10:35.191417 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.191381 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20cec3cb-f572-4622-8557-b5b5b2ce90e1-cert\") pod \"kserve-controller-manager-9bbf58456-xsttp\" (UID: \"20cec3cb-f572-4622-8557-b5b5b2ce90e1\") " pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:10:35.191598 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.191429 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnfkc\" (UniqueName: \"kubernetes.io/projected/20cec3cb-f572-4622-8557-b5b5b2ce90e1-kube-api-access-hnfkc\") pod \"kserve-controller-manager-9bbf58456-xsttp\" (UID: \"20cec3cb-f572-4622-8557-b5b5b2ce90e1\") " pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:10:35.292319 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.292286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20cec3cb-f572-4622-8557-b5b5b2ce90e1-cert\") pod \"kserve-controller-manager-9bbf58456-xsttp\" (UID: \"20cec3cb-f572-4622-8557-b5b5b2ce90e1\") " pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:10:35.292520 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.292345 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hnfkc\" (UniqueName: \"kubernetes.io/projected/20cec3cb-f572-4622-8557-b5b5b2ce90e1-kube-api-access-hnfkc\") pod \"kserve-controller-manager-9bbf58456-xsttp\" (UID: \"20cec3cb-f572-4622-8557-b5b5b2ce90e1\") " pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:10:35.294827 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.294798 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20cec3cb-f572-4622-8557-b5b5b2ce90e1-cert\") pod \"kserve-controller-manager-9bbf58456-xsttp\" (UID: \"20cec3cb-f572-4622-8557-b5b5b2ce90e1\") " pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:10:35.301660 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.301630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnfkc\" (UniqueName: \"kubernetes.io/projected/20cec3cb-f572-4622-8557-b5b5b2ce90e1-kube-api-access-hnfkc\") pod \"kserve-controller-manager-9bbf58456-xsttp\" (UID: \"20cec3cb-f572-4622-8557-b5b5b2ce90e1\") " pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:10:35.379765 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.379742 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:10:35.393456 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.393429 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/758ef89b-4a19-4796-89f7-a0652dad0b85-cert\") pod \"758ef89b-4a19-4796-89f7-a0652dad0b85\" (UID: \"758ef89b-4a19-4796-89f7-a0652dad0b85\") " Apr 16 14:10:35.393590 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.393468 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhw9n\" (UniqueName: \"kubernetes.io/projected/758ef89b-4a19-4796-89f7-a0652dad0b85-kube-api-access-vhw9n\") pod \"758ef89b-4a19-4796-89f7-a0652dad0b85\" (UID: \"758ef89b-4a19-4796-89f7-a0652dad0b85\") " Apr 16 14:10:35.395704 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.395673 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758ef89b-4a19-4796-89f7-a0652dad0b85-cert" (OuterVolumeSpecName: "cert") pod "758ef89b-4a19-4796-89f7-a0652dad0b85" (UID: "758ef89b-4a19-4796-89f7-a0652dad0b85"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:10:35.395829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.395684 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/758ef89b-4a19-4796-89f7-a0652dad0b85-kube-api-access-vhw9n" (OuterVolumeSpecName: "kube-api-access-vhw9n") pod "758ef89b-4a19-4796-89f7-a0652dad0b85" (UID: "758ef89b-4a19-4796-89f7-a0652dad0b85"). InnerVolumeSpecName "kube-api-access-vhw9n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:10:35.493897 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.493864 2571 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/758ef89b-4a19-4796-89f7-a0652dad0b85-cert\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:10:35.493897 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.493892 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vhw9n\" (UniqueName: \"kubernetes.io/projected/758ef89b-4a19-4796-89f7-a0652dad0b85-kube-api-access-vhw9n\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:10:35.524050 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.524000 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:10:35.646937 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.646857 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-xsttp"] Apr 16 14:10:35.650065 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:10:35.650037 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20cec3cb_f572_4622_8557_b5b5b2ce90e1.slice/crio-40de24a56c033be04040c7866349617d32882d0defecaf213cd0646507faa408 WatchSource:0}: Error finding container 40de24a56c033be04040c7866349617d32882d0defecaf213cd0646507faa408: Status 404 returned error can't find the container with id 40de24a56c033be04040c7866349617d32882d0defecaf213cd0646507faa408 Apr 16 14:10:35.945726 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.945634 2571 generic.go:358] "Generic (PLEG): container finished" podID="758ef89b-4a19-4796-89f7-a0652dad0b85" containerID="c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95" exitCode=0 Apr 16 14:10:35.945726 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.945707 2571 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" Apr 16 14:10:35.945726 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.945713 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" event={"ID":"758ef89b-4a19-4796-89f7-a0652dad0b85","Type":"ContainerDied","Data":"c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95"} Apr 16 14:10:35.945950 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.945748 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-74pmz" event={"ID":"758ef89b-4a19-4796-89f7-a0652dad0b85","Type":"ContainerDied","Data":"e8a7a837f319a4f9693ca3c607e9eacfddb87bcd993e8b60a8058e43f7c583c1"} Apr 16 14:10:35.945950 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.945770 2571 scope.go:117] "RemoveContainer" containerID="c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95" Apr 16 14:10:35.946976 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.946955 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-xsttp" event={"ID":"20cec3cb-f572-4622-8557-b5b5b2ce90e1","Type":"ContainerStarted","Data":"40de24a56c033be04040c7866349617d32882d0defecaf213cd0646507faa408"} Apr 16 14:10:35.953841 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.953791 2571 scope.go:117] "RemoveContainer" containerID="c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95" Apr 16 14:10:35.954147 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:10:35.954116 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95\": container with ID starting with c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95 not found: ID does not exist" 
containerID="c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95" Apr 16 14:10:35.954252 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.954158 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95"} err="failed to get container status \"c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95\": rpc error: code = NotFound desc = could not find container \"c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95\": container with ID starting with c60171ff78426ae17f129239ff7314a811b2294af6e746035b77545a29751d95 not found: ID does not exist" Apr 16 14:10:35.966720 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.966686 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-74pmz"] Apr 16 14:10:35.973734 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:35.973702 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-74pmz"] Apr 16 14:10:36.952321 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:36.952291 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-xsttp" event={"ID":"20cec3cb-f572-4622-8557-b5b5b2ce90e1","Type":"ContainerStarted","Data":"e234dd8d72571bbb558f18d995ec601d3cae48e09c7f5b9c81b22bacc2bd6563"} Apr 16 14:10:36.952321 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:36.952337 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:10:36.969474 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:36.969413 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-9bbf58456-xsttp" podStartSLOduration=1.626848927 podStartE2EDuration="1.969398632s" podCreationTimestamp="2026-04-16 14:10:35 +0000 UTC" firstStartedPulling="2026-04-16 
14:10:35.651312144 +0000 UTC m=+682.280680401" lastFinishedPulling="2026-04-16 14:10:35.993861829 +0000 UTC m=+682.623230106" observedRunningTime="2026-04-16 14:10:36.968092242 +0000 UTC m=+683.597460513" watchObservedRunningTime="2026-04-16 14:10:36.969398632 +0000 UTC m=+683.598766913" Apr 16 14:10:37.960447 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:10:37.960409 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="758ef89b-4a19-4796-89f7-a0652dad0b85" path="/var/lib/kubelet/pods/758ef89b-4a19-4796-89f7-a0652dad0b85/volumes" Apr 16 14:11:07.966061 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:11:07.965978 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-9bbf58456-xsttp" Apr 16 14:12:08.395901 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.395853 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx"] Apr 16 14:12:08.396495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.396191 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="758ef89b-4a19-4796-89f7-a0652dad0b85" containerName="manager" Apr 16 14:12:08.396495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.396203 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="758ef89b-4a19-4796-89f7-a0652dad0b85" containerName="manager" Apr 16 14:12:08.396495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.396270 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="758ef89b-4a19-4796-89f7-a0652dad0b85" containerName="manager" Apr 16 14:12:08.399409 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.399382 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.402595 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.402565 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:12:08.402595 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.402585 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\"" Apr 16 14:12:08.403048 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.403029 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 14:12:08.403602 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.403582 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-4mdx4\"" Apr 16 14:12:08.403602 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.403594 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:12:08.412047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.412005 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx"] Apr 16 14:12:08.498790 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.498755 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.498790 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:12:08.498800 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.499047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.498850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7m76\" (UniqueName: \"kubernetes.io/projected/ddd3be3a-0b59-427a-b63a-169396e5f53a-kube-api-access-n7m76\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.499047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.498887 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd3be3a-0b59-427a-b63a-169396e5f53a-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.499047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.498947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 
14:12:08.499047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.499012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.599863 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.599822 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd3be3a-0b59-427a-b63a-169396e5f53a-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.600093 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.599886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.600093 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.599933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 
14:12:08.600093 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.599967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.600093 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.599993 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.600093 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.600038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7m76\" (UniqueName: \"kubernetes.io/projected/ddd3be3a-0b59-427a-b63a-169396e5f53a-kube-api-access-n7m76\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.600457 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.600430 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.600510 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:12:08.600452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.600510 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.600493 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.600576 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.600523 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.602575 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.602553 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd3be3a-0b59-427a-b63a-169396e5f53a-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.609823 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.609785 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n7m76\" (UniqueName: \"kubernetes.io/projected/ddd3be3a-0b59-427a-b63a-169396e5f53a-kube-api-access-n7m76\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.712243 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.712133 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:08.848235 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.848172 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx"] Apr 16 14:12:08.852771 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:12:08.852728 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd3be3a_0b59_427a_b63a_169396e5f53a.slice/crio-f48d9d76597a1a68288022a19d4e0271c63988b3ef23950a0df0bbc67055b8ed WatchSource:0}: Error finding container f48d9d76597a1a68288022a19d4e0271c63988b3ef23950a0df0bbc67055b8ed: Status 404 returned error can't find the container with id f48d9d76597a1a68288022a19d4e0271c63988b3ef23950a0df0bbc67055b8ed Apr 16 14:12:08.854987 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:08.854964 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:12:09.252755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:09.252715 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" event={"ID":"ddd3be3a-0b59-427a-b63a-169396e5f53a","Type":"ContainerStarted","Data":"f48d9d76597a1a68288022a19d4e0271c63988b3ef23950a0df0bbc67055b8ed"} Apr 16 14:12:13.271533 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:12:13.271489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" event={"ID":"ddd3be3a-0b59-427a-b63a-169396e5f53a","Type":"ContainerStarted","Data":"0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e"} Apr 16 14:12:14.275767 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:14.275731 2571 generic.go:358] "Generic (PLEG): container finished" podID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerID="0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e" exitCode=0 Apr 16 14:12:14.275767 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:14.275775 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" event={"ID":"ddd3be3a-0b59-427a-b63a-169396e5f53a","Type":"ContainerDied","Data":"0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e"} Apr 16 14:12:16.287032 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:16.286957 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" event={"ID":"ddd3be3a-0b59-427a-b63a-169396e5f53a","Type":"ContainerStarted","Data":"d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b"} Apr 16 14:12:46.405146 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:46.405108 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" event={"ID":"ddd3be3a-0b59-427a-b63a-169396e5f53a","Type":"ContainerStarted","Data":"2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5"} Apr 16 14:12:46.405599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:46.405277 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 
14:12:46.408054 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:46.408029 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:46.429447 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:46.429399 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" podStartSLOduration=1.327296921 podStartE2EDuration="38.429362554s" podCreationTimestamp="2026-04-16 14:12:08 +0000 UTC" firstStartedPulling="2026-04-16 14:12:08.855116243 +0000 UTC m=+775.484484500" lastFinishedPulling="2026-04-16 14:12:45.957181874 +0000 UTC m=+812.586550133" observedRunningTime="2026-04-16 14:12:46.427959442 +0000 UTC m=+813.057327735" watchObservedRunningTime="2026-04-16 14:12:46.429362554 +0000 UTC m=+813.058730845" Apr 16 14:12:48.713117 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:48.713065 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:48.713117 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:48.713120 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:58.713958 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:58.713924 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:12:58.715262 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:12:58.715241 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:13:12.814702 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:12.814660 2571 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx"] Apr 16 14:13:12.815476 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:12.815399 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="main" containerID="cri-o://d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b" gracePeriod=30 Apr 16 14:13:12.815843 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:12.815817 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="tokenizer" containerID="cri-o://2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5" gracePeriod=30 Apr 16 14:13:13.496631 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:13.496539 2571 generic.go:358] "Generic (PLEG): container finished" podID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerID="d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b" exitCode=0 Apr 16 14:13:13.496631 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:13.496616 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" event={"ID":"ddd3be3a-0b59-427a-b63a-169396e5f53a","Type":"ContainerDied","Data":"d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b"} Apr 16 14:13:14.303594 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.303565 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:13:14.443809 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.443773 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7m76\" (UniqueName: \"kubernetes.io/projected/ddd3be3a-0b59-427a-b63a-169396e5f53a-kube-api-access-n7m76\") pod \"ddd3be3a-0b59-427a-b63a-169396e5f53a\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " Apr 16 14:13:14.443974 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.443826 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-kserve-provision-location\") pod \"ddd3be3a-0b59-427a-b63a-169396e5f53a\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " Apr 16 14:13:14.444015 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.443964 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-tmp\") pod \"ddd3be3a-0b59-427a-b63a-169396e5f53a\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " Apr 16 14:13:14.444063 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.444031 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-cache\") pod \"ddd3be3a-0b59-427a-b63a-169396e5f53a\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " Apr 16 14:13:14.444135 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.444083 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd3be3a-0b59-427a-b63a-169396e5f53a-tls-certs\") pod \"ddd3be3a-0b59-427a-b63a-169396e5f53a\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " 
Apr 16 14:13:14.444135 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.444131 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-uds\") pod \"ddd3be3a-0b59-427a-b63a-169396e5f53a\" (UID: \"ddd3be3a-0b59-427a-b63a-169396e5f53a\") " Apr 16 14:13:14.444309 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.444282 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ddd3be3a-0b59-427a-b63a-169396e5f53a" (UID: "ddd3be3a-0b59-427a-b63a-169396e5f53a"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:14.444399 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.444374 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ddd3be3a-0b59-427a-b63a-169396e5f53a" (UID: "ddd3be3a-0b59-427a-b63a-169396e5f53a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:14.444399 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.444387 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:13:14.444550 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.444523 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ddd3be3a-0b59-427a-b63a-169396e5f53a" (UID: "ddd3be3a-0b59-427a-b63a-169396e5f53a"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:14.444830 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.444802 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ddd3be3a-0b59-427a-b63a-169396e5f53a" (UID: "ddd3be3a-0b59-427a-b63a-169396e5f53a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:14.446282 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.446247 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd3be3a-0b59-427a-b63a-169396e5f53a-kube-api-access-n7m76" (OuterVolumeSpecName: "kube-api-access-n7m76") pod "ddd3be3a-0b59-427a-b63a-169396e5f53a" (UID: "ddd3be3a-0b59-427a-b63a-169396e5f53a"). InnerVolumeSpecName "kube-api-access-n7m76". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:13:14.446394 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.446325 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd3be3a-0b59-427a-b63a-169396e5f53a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ddd3be3a-0b59-427a-b63a-169396e5f53a" (UID: "ddd3be3a-0b59-427a-b63a-169396e5f53a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:13:14.507564 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.507524 2571 generic.go:358] "Generic (PLEG): container finished" podID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerID="2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5" exitCode=0 Apr 16 14:13:14.507734 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.507607 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" event={"ID":"ddd3be3a-0b59-427a-b63a-169396e5f53a","Type":"ContainerDied","Data":"2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5"} Apr 16 14:13:14.507734 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.507628 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" Apr 16 14:13:14.507734 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.507653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx" event={"ID":"ddd3be3a-0b59-427a-b63a-169396e5f53a","Type":"ContainerDied","Data":"f48d9d76597a1a68288022a19d4e0271c63988b3ef23950a0df0bbc67055b8ed"} Apr 16 14:13:14.507734 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.507671 2571 scope.go:117] "RemoveContainer" containerID="2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5" Apr 16 14:13:14.516576 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.516555 2571 scope.go:117] "RemoveContainer" containerID="d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b" Apr 16 14:13:14.524047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.524025 2571 scope.go:117] "RemoveContainer" containerID="0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e" Apr 16 14:13:14.530149 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.530123 
2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx"] Apr 16 14:13:14.533006 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.532949 2571 scope.go:117] "RemoveContainer" containerID="2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5" Apr 16 14:13:14.533567 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:13:14.533535 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5\": container with ID starting with 2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5 not found: ID does not exist" containerID="2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5" Apr 16 14:13:14.533701 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.533583 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5"} err="failed to get container status \"2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5\": rpc error: code = NotFound desc = could not find container \"2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5\": container with ID starting with 2731e1c8f5274ae47350d2ddd5cdb44887dc7c1a9f6282ecee16de964958c0d5 not found: ID does not exist" Apr 16 14:13:14.533701 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.533610 2571 scope.go:117] "RemoveContainer" containerID="d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b" Apr 16 14:13:14.533909 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:13:14.533886 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b\": container with ID starting with 
d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b not found: ID does not exist" containerID="d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b" Apr 16 14:13:14.534000 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.533912 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b"} err="failed to get container status \"d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b\": rpc error: code = NotFound desc = could not find container \"d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b\": container with ID starting with d3df06feb9ccd4ab7945b122885a5ac66758e5f09597a48b7e9a1e80d68e767b not found: ID does not exist" Apr 16 14:13:14.534000 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.533929 2571 scope.go:117] "RemoveContainer" containerID="0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e" Apr 16 14:13:14.534372 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:13:14.534347 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e\": container with ID starting with 0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e not found: ID does not exist" containerID="0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e" Apr 16 14:13:14.534443 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.534380 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e"} err="failed to get container status \"0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e\": rpc error: code = NotFound desc = could not find container \"0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e\": container with ID starting with 
0bf8d7568b4edf600e228e57f46f2208a83789461822324021c180acd1e8fc7e not found: ID does not exist" Apr 16 14:13:14.534811 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.534792 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-55695hn9sx"] Apr 16 14:13:14.545797 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.545760 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-tmp\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:13:14.545797 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.545791 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd3be3a-0b59-427a-b63a-169396e5f53a-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:13:14.545797 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.545800 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-tokenizer-uds\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:13:14.546022 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.545809 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7m76\" (UniqueName: \"kubernetes.io/projected/ddd3be3a-0b59-427a-b63a-169396e5f53a-kube-api-access-n7m76\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:13:14.546022 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:14.545820 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd3be3a-0b59-427a-b63a-169396e5f53a-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:13:15.955013 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:15.954977 2571 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" path="/var/lib/kubelet/pods/ddd3be3a-0b59-427a-b63a-169396e5f53a/volumes"
Apr 16 14:13:18.986791 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:18.986757 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"]
Apr 16 14:13:18.987294 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:18.987061 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="storage-initializer"
Apr 16 14:13:18.987294 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:18.987089 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="storage-initializer"
Apr 16 14:13:18.987294 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:18.987101 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="main"
Apr 16 14:13:18.987294 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:18.987107 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="main"
Apr 16 14:13:18.987294 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:18.987118 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="tokenizer"
Apr 16 14:13:18.987294 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:18.987123 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="tokenizer"
Apr 16 14:13:18.987294 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:18.987191 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="tokenizer"
Apr 16 14:13:18.987294 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:18.987199 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddd3be3a-0b59-427a-b63a-169396e5f53a" containerName="main"
Apr 16 14:13:19.095775 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.095734 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"]
Apr 16 14:13:19.095934 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.095867 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.099160 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.099121 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 14:13:19.100321 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.100296 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 14:13:19.100321 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.100321 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 14:13:19.100518 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.100298 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\""
Apr 16 14:13:19.179303 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.179266 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-model-cache\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.179303 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.179312 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.179526 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.179330 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-home\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.179526 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.179374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.179526 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.179418 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-dshm\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.179526 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.179446 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26qtg\" (UniqueName: \"kubernetes.io/projected/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kube-api-access-26qtg\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.276975 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.276894 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"]
Apr 16 14:13:19.280061 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.280029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-model-cache\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.280253 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.280097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.280253 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.280121 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-home\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.280253 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.280147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.280253 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.280189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-dshm\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.280253 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.280231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26qtg\" (UniqueName: \"kubernetes.io/projected/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kube-api-access-26qtg\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.280545 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.280488 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-model-cache\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.280545 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.280526 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-home\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.280659 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.280564 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.282464 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.282436 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-dshm\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.282705 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.282689 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.298228 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.298196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26qtg\" (UniqueName: \"kubernetes.io/projected/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kube-api-access-26qtg\") pod \"scheduler-ha-replicas-test-kserve-76597456c7-j8s9c\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.298532 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.298509 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"]
Apr 16 14:13:19.298659 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.298648 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.301326 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.301298 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-xnnkk\""
Apr 16 14:13:19.381313 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.381275 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.381495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.381339 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.381495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.381375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.381495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.381398 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.381495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.381426 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8jbp\" (UniqueName: \"kubernetes.io/projected/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kube-api-access-j8jbp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.381625 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.381495 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.407065 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.407025 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:19.481961 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.481921 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.482174 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.482093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.482220 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.482172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.482220 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.482210 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.482309 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.482240 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.482309 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.482266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8jbp\" (UniqueName: \"kubernetes.io/projected/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kube-api-access-j8jbp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.482462 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.482410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.482524 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.482506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.482609 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.482572 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.482735 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.482716 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.484814 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.484788 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.490802 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.490776 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8jbp\" (UniqueName: \"kubernetes.io/projected/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kube-api-access-j8jbp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.548436 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.548340 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"]
Apr 16 14:13:19.551474 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:13:19.551447 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd445f4fb_f9ad_4896_a9cc_3a1b37e3638b.slice/crio-060b9452e97b2f810829f8661bc8b66a461a60462934835b3e415a6cc5b34a38 WatchSource:0}: Error finding container 060b9452e97b2f810829f8661bc8b66a461a60462934835b3e415a6cc5b34a38: Status 404 returned error can't find the container with id 060b9452e97b2f810829f8661bc8b66a461a60462934835b3e415a6cc5b34a38
Apr 16 14:13:19.614833 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.614797 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:19.751512 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:19.751476 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"]
Apr 16 14:13:19.755349 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:13:19.755318 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd29ea6cb_7bb5_42a9_86bd_5342fd88ccf2.slice/crio-2c1e3c9f58fa48a5b6984a51005e72c599c21e72dd438037c1e2d3a5f3be5624 WatchSource:0}: Error finding container 2c1e3c9f58fa48a5b6984a51005e72c599c21e72dd438037c1e2d3a5f3be5624: Status 404 returned error can't find the container with id 2c1e3c9f58fa48a5b6984a51005e72c599c21e72dd438037c1e2d3a5f3be5624
Apr 16 14:13:20.529090 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:20.529041 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" event={"ID":"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2","Type":"ContainerStarted","Data":"55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2"}
Apr 16 14:13:20.529090 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:20.529097 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" event={"ID":"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2","Type":"ContainerStarted","Data":"2c1e3c9f58fa48a5b6984a51005e72c599c21e72dd438037c1e2d3a5f3be5624"}
Apr 16 14:13:20.530449 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:20.530419 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c" event={"ID":"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b","Type":"ContainerStarted","Data":"36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265"}
Apr 16 14:13:20.530449 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:20.530451 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c" event={"ID":"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b","Type":"ContainerStarted","Data":"060b9452e97b2f810829f8661bc8b66a461a60462934835b3e415a6cc5b34a38"}
Apr 16 14:13:21.535222 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:21.535180 2571 generic.go:358] "Generic (PLEG): container finished" podID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerID="55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2" exitCode=0
Apr 16 14:13:21.535606 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:21.535297 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" event={"ID":"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2","Type":"ContainerDied","Data":"55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2"}
Apr 16 14:13:22.540760 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:22.540720 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" event={"ID":"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2","Type":"ContainerStarted","Data":"2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc"}
Apr 16 14:13:22.541275 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:22.540769 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" event={"ID":"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2","Type":"ContainerStarted","Data":"7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03"}
Apr 16 14:13:22.541275 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:22.540858 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:22.563726 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:22.563661 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" podStartSLOduration=3.563640833 podStartE2EDuration="3.563640833s" podCreationTimestamp="2026-04-16 14:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:13:22.562061654 +0000 UTC m=+849.191429930" watchObservedRunningTime="2026-04-16 14:13:22.563640833 +0000 UTC m=+849.193009113"
Apr 16 14:13:24.549100 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:24.549049 2571 generic.go:358] "Generic (PLEG): container finished" podID="d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" containerID="36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265" exitCode=0
Apr 16 14:13:24.549565 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:24.549129 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c" event={"ID":"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b","Type":"ContainerDied","Data":"36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265"}
Apr 16 14:13:26.560528 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:26.560488 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c" event={"ID":"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b","Type":"ContainerStarted","Data":"4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167"}
Apr 16 14:13:26.580607 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:26.580547 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c" podStartSLOduration=7.504192638 podStartE2EDuration="8.580529872s" podCreationTimestamp="2026-04-16 14:13:18 +0000 UTC" firstStartedPulling="2026-04-16 14:13:24.550364257 +0000 UTC m=+851.179732514" lastFinishedPulling="2026-04-16 14:13:25.626701476 +0000 UTC m=+852.256069748" observedRunningTime="2026-04-16 14:13:26.579119805 +0000 UTC m=+853.208488089" watchObservedRunningTime="2026-04-16 14:13:26.580529872 +0000 UTC m=+853.209898151"
Apr 16 14:13:29.408195 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:29.408149 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:29.408621 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:29.408209 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:29.421037 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:29.421010 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:29.583772 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:29.583737 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:13:29.615546 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:29.615509 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:29.615787 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:29.615764 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:29.617129 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:29.617086 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.27:8082/healthz\": dial tcp 10.132.0.27:8082: connect: connection refused"
Apr 16 14:13:39.617136 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:39.617107 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:13:39.618464 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:13:39.618443 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:14:00.611686 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:00.611652 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"
Apr 16 14:14:01.633056 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:01.633010 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"]
Apr 16 14:14:01.633465 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:01.633329 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c" podUID="d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" containerName="main" containerID="cri-o://4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167" gracePeriod=30
Apr 16 14:14:01.643006 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:01.642967 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"]
Apr 16 14:14:01.643404 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:01.643370 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="main" containerID="cri-o://7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03" gracePeriod=30
Apr 16 14:14:01.644062 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:01.643576 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="tokenizer" containerID="cri-o://2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc" gracePeriod=30
Apr 16 14:14:01.889109 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:01.889025 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"
Apr 16 14:14:02.055333 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.055253 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kserve-provision-location\") pod \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") "
Apr 16 14:14:02.055496 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.055357 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-model-cache\") pod \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") "
Apr 16 14:14:02.055496 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.055382 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26qtg\" (UniqueName: \"kubernetes.io/projected/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kube-api-access-26qtg\") pod \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") "
Apr 16 14:14:02.055496 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.055409 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-tls-certs\") pod \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") "
Apr 16 14:14:02.055496 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.055438 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-dshm\") pod \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") "
Apr 16 14:14:02.055496 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.055466 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-home\") pod \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\" (UID: \"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b\") "
Apr 16 14:14:02.055759 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.055622 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-model-cache" (OuterVolumeSpecName: "model-cache") pod "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" (UID: "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:14:02.055885 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.055841 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-home" (OuterVolumeSpecName: "home") pod "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" (UID: "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:14:02.058197 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.058157 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-dshm" (OuterVolumeSpecName: "dshm") pod "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" (UID: "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:14:02.058336 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.058320 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" (UID: "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:14:02.058423 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.058397 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kube-api-access-26qtg" (OuterVolumeSpecName: "kube-api-access-26qtg") pod "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" (UID: "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b"). InnerVolumeSpecName "kube-api-access-26qtg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:14:02.111990 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.111943 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" (UID: "d445f4fb-f9ad-4896-a9cc-3a1b37e3638b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:14:02.156573 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.156488 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26qtg\" (UniqueName: \"kubernetes.io/projected/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kube-api-access-26qtg\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:14:02.156573 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.156515 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:14:02.156573 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.156526 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-dshm\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:14:02.156573 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.156534 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-home\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:14:02.156573 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.156542 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:14:02.156573 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.156551 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b-model-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:14:02.686482 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.686449 2571 generic.go:358]
"Generic (PLEG): container finished" podID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerID="7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03" exitCode=0 Apr 16 14:14:02.686899 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.686529 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" event={"ID":"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2","Type":"ContainerDied","Data":"7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03"} Apr 16 14:14:02.687912 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.687890 2571 generic.go:358] "Generic (PLEG): container finished" podID="d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" containerID="4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167" exitCode=0 Apr 16 14:14:02.688019 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.687979 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c" event={"ID":"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b","Type":"ContainerDied","Data":"4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167"} Apr 16 14:14:02.688019 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.688013 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c" event={"ID":"d445f4fb-f9ad-4896-a9cc-3a1b37e3638b","Type":"ContainerDied","Data":"060b9452e97b2f810829f8661bc8b66a461a60462934835b3e415a6cc5b34a38"} Apr 16 14:14:02.688101 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.688030 2571 scope.go:117] "RemoveContainer" containerID="4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167" Apr 16 14:14:02.688101 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.687984 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c" Apr 16 14:14:02.696261 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.696240 2571 scope.go:117] "RemoveContainer" containerID="36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265" Apr 16 14:14:02.712667 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.712608 2571 scope.go:117] "RemoveContainer" containerID="4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167" Apr 16 14:14:02.713064 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:02.713029 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167\": container with ID starting with 4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167 not found: ID does not exist" containerID="4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167" Apr 16 14:14:02.713360 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.713313 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167"} err="failed to get container status \"4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167\": rpc error: code = NotFound desc = could not find container \"4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167\": container with ID starting with 4059ea45e09e3c1dedac96974a2e7a874d356a5064b3e81b712d604cf8bd1167 not found: ID does not exist" Apr 16 14:14:02.713360 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.713353 2571 scope.go:117] "RemoveContainer" containerID="36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265" Apr 16 14:14:02.713522 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.713401 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"] 
Apr 16 14:14:02.713742 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:02.713718 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265\": container with ID starting with 36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265 not found: ID does not exist" containerID="36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265" Apr 16 14:14:02.713818 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.713751 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265"} err="failed to get container status \"36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265\": rpc error: code = NotFound desc = could not find container \"36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265\": container with ID starting with 36b05d778e9ae4349d3799d00e04b3177447304f4a2dc54199e605c2bb191265 not found: ID does not exist" Apr 16 14:14:02.719493 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:02.719466 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76597456c7-j8s9c"] Apr 16 14:14:03.103811 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.103787 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" Apr 16 14:14:03.264775 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.264678 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kserve-provision-location\") pod \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " Apr 16 14:14:03.264775 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.264735 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-cache\") pod \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " Apr 16 14:14:03.264775 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.264760 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-uds\") pod \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " Apr 16 14:14:03.265054 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.264782 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8jbp\" (UniqueName: \"kubernetes.io/projected/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kube-api-access-j8jbp\") pod \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " Apr 16 14:14:03.265054 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.264819 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-tmp\") pod \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\" (UID: 
\"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " Apr 16 14:14:03.265054 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.264865 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tls-certs\") pod \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\" (UID: \"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2\") " Apr 16 14:14:03.265054 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.265039 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" (UID: "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:03.265280 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.265052 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" (UID: "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:03.265280 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.265201 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" (UID: "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:03.265640 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.265614 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" (UID: "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:03.267054 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.267029 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kube-api-access-j8jbp" (OuterVolumeSpecName: "kube-api-access-j8jbp") pod "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" (UID: "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2"). InnerVolumeSpecName "kube-api-access-j8jbp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:14:03.267184 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.267102 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" (UID: "d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:14:03.365687 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.365641 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:03.365687 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.365680 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-uds\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:03.365687 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.365690 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8jbp\" (UniqueName: \"kubernetes.io/projected/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kube-api-access-j8jbp\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:03.365927 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.365700 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tokenizer-tmp\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:03.365927 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.365710 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:03.365927 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.365719 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:03.692789 ip-10-0-131-99 kubenswrapper[2571]: I0416 
14:14:03.692749 2571 generic.go:358] "Generic (PLEG): container finished" podID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerID="2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc" exitCode=0 Apr 16 14:14:03.693277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.692829 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" event={"ID":"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2","Type":"ContainerDied","Data":"2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc"} Apr 16 14:14:03.693277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.692861 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" Apr 16 14:14:03.693277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.692891 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8" event={"ID":"d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2","Type":"ContainerDied","Data":"2c1e3c9f58fa48a5b6984a51005e72c599c21e72dd438037c1e2d3a5f3be5624"} Apr 16 14:14:03.693277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.692912 2571 scope.go:117] "RemoveContainer" containerID="2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc" Apr 16 14:14:03.701940 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.701919 2571 scope.go:117] "RemoveContainer" containerID="7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03" Apr 16 14:14:03.709644 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.709622 2571 scope.go:117] "RemoveContainer" containerID="55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2" Apr 16 14:14:03.716717 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.716684 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"] Apr 16 14:14:03.717816 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.717799 2571 scope.go:117] "RemoveContainer" containerID="2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc" Apr 16 14:14:03.718244 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:03.718215 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc\": container with ID starting with 2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc not found: ID does not exist" containerID="2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc" Apr 16 14:14:03.718408 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.718256 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc"} err="failed to get container status \"2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc\": rpc error: code = NotFound desc = could not find container \"2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc\": container with ID starting with 2f0a4ed557020664e0c84a3baa9a454954bea532f6cd0b0df51a69e2b89417fc not found: ID does not exist" Apr 16 14:14:03.718408 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.718287 2571 scope.go:117] "RemoveContainer" containerID="7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03" Apr 16 14:14:03.719091 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:03.719041 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03\": container with ID starting with 7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03 not found: ID does not exist" 
containerID="7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03" Apr 16 14:14:03.719187 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.719098 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03"} err="failed to get container status \"7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03\": rpc error: code = NotFound desc = could not find container \"7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03\": container with ID starting with 7ef712c5b8ba772d787ba8a1407cb42c6cd6c8e3d60c6f4fe8878d949c5f3e03 not found: ID does not exist" Apr 16 14:14:03.719187 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.719125 2571 scope.go:117] "RemoveContainer" containerID="55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2" Apr 16 14:14:03.719455 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:03.719432 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2\": container with ID starting with 55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2 not found: ID does not exist" containerID="55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2" Apr 16 14:14:03.719550 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.719460 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2"} err="failed to get container status \"55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2\": rpc error: code = NotFound desc = could not find container \"55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2\": container with ID starting with 55d7aa8dc1e27456637a898f5295ecbef8aed72c85346b0fec4dad8579c107c2 not found: ID does not exist" Apr 16 
14:14:03.720350 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.720329 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bqvsz8"] Apr 16 14:14:03.955577 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.955497 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" path="/var/lib/kubelet/pods/d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2/volumes" Apr 16 14:14:03.955956 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:03.955942 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" path="/var/lib/kubelet/pods/d445f4fb-f9ad-4896-a9cc-3a1b37e3638b/volumes" Apr 16 14:14:11.720065 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720027 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt"] Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720423 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" containerName="main" Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720440 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" containerName="main" Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720469 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="tokenizer" Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720478 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="tokenizer" Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720493 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="storage-initializer" Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720503 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="storage-initializer" Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720517 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="main" Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720525 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="main" Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720535 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" containerName="storage-initializer" Apr 16 14:14:11.720599 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720543 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" containerName="storage-initializer" Apr 16 14:14:11.720960 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720606 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="tokenizer" Apr 16 14:14:11.720960 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720617 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d29ea6cb-7bb5-42a9-86bd-5342fd88ccf2" containerName="main" Apr 16 14:14:11.720960 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.720631 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d445f4fb-f9ad-4896-a9cc-3a1b37e3638b" containerName="main" Apr 16 14:14:11.722987 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.722966 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.726411 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.726381 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:14:11.727433 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.727411 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\"" Apr 16 14:14:11.727566 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.727450 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 14:14:11.727566 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.727457 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:14:11.733734 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.733707 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt"] Apr 16 14:14:11.839237 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.839192 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.839419 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.839258 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-model-cache\") pod 
\"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.839419 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.839308 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64swj\" (UniqueName: \"kubernetes.io/projected/f52c217c-9cc3-4118-ada1-ccf646f162e2-kube-api-access-64swj\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.839419 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.839338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f52c217c-9cc3-4118-ada1-ccf646f162e2-tls-certs\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.839419 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.839383 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-dshm\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.839419 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.839413 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-home\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.940500 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.940460 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64swj\" (UniqueName: \"kubernetes.io/projected/f52c217c-9cc3-4118-ada1-ccf646f162e2-kube-api-access-64swj\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.940686 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.940511 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f52c217c-9cc3-4118-ada1-ccf646f162e2-tls-certs\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.940686 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.940550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-dshm\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.940686 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.940582 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-home\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.940686 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.940626 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.940686 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.940667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-model-cache\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.941027 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.941004 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-home\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.941133 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.941037 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-model-cache\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.941133 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.941112 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-kserve-provision-location\") pod 
\"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.943173 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.943150 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-dshm\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.943309 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.943218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f52c217c-9cc3-4118-ada1-ccf646f162e2-tls-certs\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:11.949259 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:11.949218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64swj\" (UniqueName: \"kubernetes.io/projected/f52c217c-9cc3-4118-ada1-ccf646f162e2-kube-api-access-64swj\") pod \"precise-prefix-cache-test-kserve-64b95fbc77-h6qtt\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:12.034426 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.034328 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:12.166022 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.165981 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt"] Apr 16 14:14:12.169560 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:14:12.169529 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf52c217c_9cc3_4118_ada1_ccf646f162e2.slice/crio-0402d2158b32e8d93b09bc13694046ec08819e865381d680971718a40076d27b WatchSource:0}: Error finding container 0402d2158b32e8d93b09bc13694046ec08819e865381d680971718a40076d27b: Status 404 returned error can't find the container with id 0402d2158b32e8d93b09bc13694046ec08819e865381d680971718a40076d27b Apr 16 14:14:12.232531 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.232478 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g"] Apr 16 14:14:12.235023 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.235002 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.237645 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.237618 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-cszwl\"" Apr 16 14:14:12.243137 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.243100 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.243308 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.243160 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdjz\" (UniqueName: \"kubernetes.io/projected/65a5692e-6730-477e-8d9f-6dcf7318a5be-kube-api-access-tzdjz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.243308 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.243242 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.243308 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.243273 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.243485 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.243367 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65a5692e-6730-477e-8d9f-6dcf7318a5be-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.243485 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.243440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.248723 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.248695 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g"] Apr 16 14:14:12.344088 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.343931 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65a5692e-6730-477e-8d9f-6dcf7318a5be-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: 
\"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.344341 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.344282 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.344521 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.344459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.344592 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.344516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdjz\" (UniqueName: \"kubernetes.io/projected/65a5692e-6730-477e-8d9f-6dcf7318a5be-kube-api-access-tzdjz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.344663 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.344627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: 
\"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.344756 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.344726 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.344927 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.344777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.344927 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.344832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.345172 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.345144 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.345378 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.345358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.347682 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.347653 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65a5692e-6730-477e-8d9f-6dcf7318a5be-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.357424 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.357386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdjz\" (UniqueName: \"kubernetes.io/projected/65a5692e-6730-477e-8d9f-6dcf7318a5be-kube-api-access-tzdjz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.548151 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.548111 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:12.691470 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.691411 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g"] Apr 16 14:14:12.696535 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:14:12.696497 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65a5692e_6730_477e_8d9f_6dcf7318a5be.slice/crio-e89ac1a06836e16a144f495c0b902928e15c0de80e2db7c842b404b65f9952fa WatchSource:0}: Error finding container e89ac1a06836e16a144f495c0b902928e15c0de80e2db7c842b404b65f9952fa: Status 404 returned error can't find the container with id e89ac1a06836e16a144f495c0b902928e15c0de80e2db7c842b404b65f9952fa Apr 16 14:14:12.728362 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.728316 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" event={"ID":"f52c217c-9cc3-4118-ada1-ccf646f162e2","Type":"ContainerStarted","Data":"715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0"} Apr 16 14:14:12.728812 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.728373 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" event={"ID":"f52c217c-9cc3-4118-ada1-ccf646f162e2","Type":"ContainerStarted","Data":"0402d2158b32e8d93b09bc13694046ec08819e865381d680971718a40076d27b"} Apr 16 14:14:12.729557 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:12.729525 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" event={"ID":"65a5692e-6730-477e-8d9f-6dcf7318a5be","Type":"ContainerStarted","Data":"e89ac1a06836e16a144f495c0b902928e15c0de80e2db7c842b404b65f9952fa"} 
Apr 16 14:14:13.734907 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:13.734874 2571 generic.go:358] "Generic (PLEG): container finished" podID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerID="e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86" exitCode=0 Apr 16 14:14:13.735296 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:13.734971 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" event={"ID":"65a5692e-6730-477e-8d9f-6dcf7318a5be","Type":"ContainerDied","Data":"e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86"} Apr 16 14:14:13.911088 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:13.911033 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 14:14:13.912020 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:13.911994 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 14:14:14.741409 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:14.741316 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" event={"ID":"65a5692e-6730-477e-8d9f-6dcf7318a5be","Type":"ContainerStarted","Data":"1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0"} Apr 16 14:14:14.742397 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:14.741418 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" event={"ID":"65a5692e-6730-477e-8d9f-6dcf7318a5be","Type":"ContainerStarted","Data":"e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01"} Apr 16 14:14:14.742397 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:14.741546 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:14.763949 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:14.763883 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" podStartSLOduration=2.763863265 podStartE2EDuration="2.763863265s" podCreationTimestamp="2026-04-16 14:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:14:14.762930304 +0000 UTC m=+901.392298583" watchObservedRunningTime="2026-04-16 14:14:14.763863265 +0000 UTC m=+901.393231541" Apr 16 14:14:16.752089 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:16.752034 2571 generic.go:358] "Generic (PLEG): container finished" podID="f52c217c-9cc3-4118-ada1-ccf646f162e2" containerID="715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0" exitCode=0 Apr 16 14:14:16.752560 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:16.752106 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" event={"ID":"f52c217c-9cc3-4118-ada1-ccf646f162e2","Type":"ContainerDied","Data":"715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0"} Apr 16 14:14:17.757106 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:17.757061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" event={"ID":"f52c217c-9cc3-4118-ada1-ccf646f162e2","Type":"ContainerStarted","Data":"bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2"} Apr 16 14:14:17.776830 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:17.776781 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" podStartSLOduration=6.776762728 podStartE2EDuration="6.776762728s" podCreationTimestamp="2026-04-16 14:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:14:17.774609508 +0000 UTC m=+904.403977824" watchObservedRunningTime="2026-04-16 14:14:17.776762728 +0000 UTC m=+904.406131088" Apr 16 14:14:22.035483 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:22.035423 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:22.035906 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:22.035511 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:22.048210 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:22.048177 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:22.548751 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:22.548695 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:22.548751 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:22.548757 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:22.550110 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:14:22.550045 2571 logging.go:55] [core] [Channel #50 SubChannel #51]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.29:9003", ServerName: "10.132.0.29:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.29:9003: connect: connection refused" Apr 16 14:14:22.551426 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:22.551405 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:22.776741 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:22.776712 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:22.786938 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:22.786904 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:23.549740 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:23.549679 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.29:9003\" within 1s: context deadline exceeded" Apr 16 14:14:32.549609 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:14:32.549575 2571 logging.go:55] [core] [Channel #52 SubChannel #53]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.29:9003", ServerName: "10.132.0.29:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.29:9003: connect: connection refused" Apr 16 14:14:33.549290 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:33.549237 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.29:9003\" within 1s: context deadline exceeded" Apr 16 14:14:43.779750 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:43.779718 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:44.844021 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:44.843306 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt"] Apr 16 14:14:44.844021 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:44.843748 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" podUID="f52c217c-9cc3-4118-ada1-ccf646f162e2" containerName="main" containerID="cri-o://bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2" gracePeriod=30 Apr 16 14:14:44.846611 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:44.846574 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g"] Apr 16 14:14:44.847386 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:44.847321 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="tokenizer" 
containerID="cri-o://1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0" gracePeriod=30 Apr 16 14:14:44.848338 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:44.848284 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="main" containerID="cri-o://e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01" gracePeriod=30 Apr 16 14:14:45.108461 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.108386 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:45.227214 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227159 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-model-cache\") pod \"f52c217c-9cc3-4118-ada1-ccf646f162e2\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " Apr 16 14:14:45.227214 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227222 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-dshm\") pod \"f52c217c-9cc3-4118-ada1-ccf646f162e2\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " Apr 16 14:14:45.227480 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227242 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-home\") pod \"f52c217c-9cc3-4118-ada1-ccf646f162e2\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " Apr 16 14:14:45.227480 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227297 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-kserve-provision-location\") pod \"f52c217c-9cc3-4118-ada1-ccf646f162e2\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " Apr 16 14:14:45.227480 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227326 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64swj\" (UniqueName: \"kubernetes.io/projected/f52c217c-9cc3-4118-ada1-ccf646f162e2-kube-api-access-64swj\") pod \"f52c217c-9cc3-4118-ada1-ccf646f162e2\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " Apr 16 14:14:45.227480 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227376 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f52c217c-9cc3-4118-ada1-ccf646f162e2-tls-certs\") pod \"f52c217c-9cc3-4118-ada1-ccf646f162e2\" (UID: \"f52c217c-9cc3-4118-ada1-ccf646f162e2\") " Apr 16 14:14:45.227675 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227507 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-model-cache" (OuterVolumeSpecName: "model-cache") pod "f52c217c-9cc3-4118-ada1-ccf646f162e2" (UID: "f52c217c-9cc3-4118-ada1-ccf646f162e2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:45.227675 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227533 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-home" (OuterVolumeSpecName: "home") pod "f52c217c-9cc3-4118-ada1-ccf646f162e2" (UID: "f52c217c-9cc3-4118-ada1-ccf646f162e2"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:45.227675 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227648 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-model-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:45.227675 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.227668 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-home\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:45.229554 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.229522 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52c217c-9cc3-4118-ada1-ccf646f162e2-kube-api-access-64swj" (OuterVolumeSpecName: "kube-api-access-64swj") pod "f52c217c-9cc3-4118-ada1-ccf646f162e2" (UID: "f52c217c-9cc3-4118-ada1-ccf646f162e2"). InnerVolumeSpecName "kube-api-access-64swj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:14:45.229875 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.229851 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-dshm" (OuterVolumeSpecName: "dshm") pod "f52c217c-9cc3-4118-ada1-ccf646f162e2" (UID: "f52c217c-9cc3-4118-ada1-ccf646f162e2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:45.229954 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.229910 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52c217c-9cc3-4118-ada1-ccf646f162e2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f52c217c-9cc3-4118-ada1-ccf646f162e2" (UID: "f52c217c-9cc3-4118-ada1-ccf646f162e2"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:14:45.288377 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.288323 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f52c217c-9cc3-4118-ada1-ccf646f162e2" (UID: "f52c217c-9cc3-4118-ada1-ccf646f162e2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:45.328511 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.328464 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:45.328511 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.328512 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-64swj\" (UniqueName: \"kubernetes.io/projected/f52c217c-9cc3-4118-ada1-ccf646f162e2-kube-api-access-64swj\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:45.328700 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.328555 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f52c217c-9cc3-4118-ada1-ccf646f162e2-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:45.328700 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.328572 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f52c217c-9cc3-4118-ada1-ccf646f162e2-dshm\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:45.854247 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.854210 2571 generic.go:358] "Generic (PLEG): container finished" podID="f52c217c-9cc3-4118-ada1-ccf646f162e2" 
containerID="bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2" exitCode=0 Apr 16 14:14:45.854716 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.854295 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" Apr 16 14:14:45.854716 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.854294 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" event={"ID":"f52c217c-9cc3-4118-ada1-ccf646f162e2","Type":"ContainerDied","Data":"bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2"} Apr 16 14:14:45.854716 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.854338 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt" event={"ID":"f52c217c-9cc3-4118-ada1-ccf646f162e2","Type":"ContainerDied","Data":"0402d2158b32e8d93b09bc13694046ec08819e865381d680971718a40076d27b"} Apr 16 14:14:45.854716 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.854361 2571 scope.go:117] "RemoveContainer" containerID="bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2" Apr 16 14:14:45.856659 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.856626 2571 generic.go:358] "Generic (PLEG): container finished" podID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerID="e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01" exitCode=0 Apr 16 14:14:45.856763 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.856700 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" event={"ID":"65a5692e-6730-477e-8d9f-6dcf7318a5be","Type":"ContainerDied","Data":"e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01"} Apr 16 14:14:45.864144 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.863718 2571 scope.go:117] 
"RemoveContainer" containerID="715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0" Apr 16 14:14:45.874394 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.874372 2571 scope.go:117] "RemoveContainer" containerID="bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2" Apr 16 14:14:45.874902 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:45.874842 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2\": container with ID starting with bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2 not found: ID does not exist" containerID="bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2" Apr 16 14:14:45.874902 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.874887 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2"} err="failed to get container status \"bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2\": rpc error: code = NotFound desc = could not find container \"bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2\": container with ID starting with bf34f8e037a3d71e8fe48293d95a556cb3bb5703534fe7be7cfd2d9c3b095fb2 not found: ID does not exist" Apr 16 14:14:45.875153 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.874915 2571 scope.go:117] "RemoveContainer" containerID="715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0" Apr 16 14:14:45.875369 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:45.875270 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0\": container with ID starting with 715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0 not found: ID does not exist" 
containerID="715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0" Apr 16 14:14:45.875369 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.875310 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0"} err="failed to get container status \"715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0\": rpc error: code = NotFound desc = could not find container \"715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0\": container with ID starting with 715caba2eeb0b9bbbeca53f94fbbe1511574cf564ba1f6ea1d80f1a8cae3c5f0 not found: ID does not exist" Apr 16 14:14:45.877144 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.877120 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt"] Apr 16 14:14:45.880691 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.880666 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-64b95fbc77-h6qtt"] Apr 16 14:14:45.956089 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:45.956048 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52c217c-9cc3-4118-ada1-ccf646f162e2" path="/var/lib/kubelet/pods/f52c217c-9cc3-4118-ada1-ccf646f162e2/volumes" Apr 16 14:14:46.602662 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.602636 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:46.742065 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.741961 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65a5692e-6730-477e-8d9f-6dcf7318a5be-tls-certs\") pod \"65a5692e-6730-477e-8d9f-6dcf7318a5be\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " Apr 16 14:14:46.742065 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.742033 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-tmp\") pod \"65a5692e-6730-477e-8d9f-6dcf7318a5be\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " Apr 16 14:14:46.742065 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.742065 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-kserve-provision-location\") pod \"65a5692e-6730-477e-8d9f-6dcf7318a5be\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " Apr 16 14:14:46.742370 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.742136 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-cache\") pod \"65a5692e-6730-477e-8d9f-6dcf7318a5be\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " Apr 16 14:14:46.742370 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.742166 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzdjz\" (UniqueName: \"kubernetes.io/projected/65a5692e-6730-477e-8d9f-6dcf7318a5be-kube-api-access-tzdjz\") pod \"65a5692e-6730-477e-8d9f-6dcf7318a5be\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " 
Apr 16 14:14:46.742370 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.742189 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-uds\") pod \"65a5692e-6730-477e-8d9f-6dcf7318a5be\" (UID: \"65a5692e-6730-477e-8d9f-6dcf7318a5be\") " Apr 16 14:14:46.742520 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.742434 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "65a5692e-6730-477e-8d9f-6dcf7318a5be" (UID: "65a5692e-6730-477e-8d9f-6dcf7318a5be"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:46.742663 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.742636 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "65a5692e-6730-477e-8d9f-6dcf7318a5be" (UID: "65a5692e-6730-477e-8d9f-6dcf7318a5be"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:46.742663 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.742648 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "65a5692e-6730-477e-8d9f-6dcf7318a5be" (UID: "65a5692e-6730-477e-8d9f-6dcf7318a5be"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:46.743053 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.743027 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "65a5692e-6730-477e-8d9f-6dcf7318a5be" (UID: "65a5692e-6730-477e-8d9f-6dcf7318a5be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:46.744323 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.744299 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a5692e-6730-477e-8d9f-6dcf7318a5be-kube-api-access-tzdjz" (OuterVolumeSpecName: "kube-api-access-tzdjz") pod "65a5692e-6730-477e-8d9f-6dcf7318a5be" (UID: "65a5692e-6730-477e-8d9f-6dcf7318a5be"). InnerVolumeSpecName "kube-api-access-tzdjz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:14:46.744426 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.744347 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a5692e-6730-477e-8d9f-6dcf7318a5be-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "65a5692e-6730-477e-8d9f-6dcf7318a5be" (UID: "65a5692e-6730-477e-8d9f-6dcf7318a5be"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:14:46.843146 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.843104 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65a5692e-6730-477e-8d9f-6dcf7318a5be-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:46.843146 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.843141 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-tmp\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:46.843146 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.843156 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:46.843375 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.843168 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:46.843375 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.843180 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzdjz\" (UniqueName: \"kubernetes.io/projected/65a5692e-6730-477e-8d9f-6dcf7318a5be-kube-api-access-tzdjz\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:46.843375 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.843190 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/65a5692e-6730-477e-8d9f-6dcf7318a5be-tokenizer-uds\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:14:46.863288 ip-10-0-131-99 kubenswrapper[2571]: I0416 
14:14:46.863252 2571 generic.go:358] "Generic (PLEG): container finished" podID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerID="1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0" exitCode=0 Apr 16 14:14:46.863702 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.863324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" event={"ID":"65a5692e-6730-477e-8d9f-6dcf7318a5be","Type":"ContainerDied","Data":"1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0"} Apr 16 14:14:46.863702 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.863330 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" Apr 16 14:14:46.863702 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.863360 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g" event={"ID":"65a5692e-6730-477e-8d9f-6dcf7318a5be","Type":"ContainerDied","Data":"e89ac1a06836e16a144f495c0b902928e15c0de80e2db7c842b404b65f9952fa"} Apr 16 14:14:46.863702 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.863377 2571 scope.go:117] "RemoveContainer" containerID="1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0" Apr 16 14:14:46.872038 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.871958 2571 scope.go:117] "RemoveContainer" containerID="e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01" Apr 16 14:14:46.879568 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.879540 2571 scope.go:117] "RemoveContainer" containerID="e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86" Apr 16 14:14:46.886311 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.886275 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g"] Apr 16 14:14:46.889157 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.889135 2571 scope.go:117] "RemoveContainer" containerID="1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0" Apr 16 14:14:46.889832 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.889779 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9bdd955frs52g"] Apr 16 14:14:46.890389 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:46.890361 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0\": container with ID starting with 1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0 not found: ID does not exist" containerID="1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0" Apr 16 14:14:46.890495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.890403 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0"} err="failed to get container status \"1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0\": rpc error: code = NotFound desc = could not find container \"1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0\": container with ID starting with 1b0cec4f9faad123dab5d241101451a5f8304e59b2091fa6daba1947802d99c0 not found: ID does not exist" Apr 16 14:14:46.890495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.890423 2571 scope.go:117] "RemoveContainer" containerID="e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01" Apr 16 14:14:46.890764 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:46.890747 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01\": container with ID starting with e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01 not found: ID does not exist" containerID="e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01" Apr 16 14:14:46.890832 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.890768 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01"} err="failed to get container status \"e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01\": rpc error: code = NotFound desc = could not find container \"e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01\": container with ID starting with e5486fdf9d8e3cd15885b953e5c82d43dc1cf87f740fe18a61d8040ed560fb01 not found: ID does not exist" Apr 16 14:14:46.890832 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.890799 2571 scope.go:117] "RemoveContainer" containerID="e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86" Apr 16 14:14:46.891096 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:14:46.891061 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86\": container with ID starting with e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86 not found: ID does not exist" containerID="e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86" Apr 16 14:14:46.891160 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:46.891103 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86"} err="failed to get container status \"e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86\": rpc error: code = NotFound desc = could not find container 
\"e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86\": container with ID starting with e17db6a224c8871c831a5f976882091251afeefeca8e027afafe118c807f4e86 not found: ID does not exist" Apr 16 14:14:47.955329 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:47.955286 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" path="/var/lib/kubelet/pods/65a5692e-6730-477e-8d9f-6dcf7318a5be/volumes" Apr 16 14:14:57.034555 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034509 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v"] Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034818 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="tokenizer" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034833 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="tokenizer" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034843 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="storage-initializer" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034849 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="storage-initializer" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034858 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="main" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034863 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="main" 
Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034872 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f52c217c-9cc3-4118-ada1-ccf646f162e2" containerName="storage-initializer" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034880 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52c217c-9cc3-4118-ada1-ccf646f162e2" containerName="storage-initializer" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034890 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f52c217c-9cc3-4118-ada1-ccf646f162e2" containerName="main" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034895 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52c217c-9cc3-4118-ada1-ccf646f162e2" containerName="main" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034942 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="tokenizer" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034955 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="65a5692e-6730-477e-8d9f-6dcf7318a5be" containerName="main" Apr 16 14:14:57.035104 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.034964 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f52c217c-9cc3-4118-ada1-ccf646f162e2" containerName="main" Apr 16 14:14:57.038198 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.038168 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.041217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.041160 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 14:14:57.042313 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.042276 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:14:57.042495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.042325 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\"" Apr 16 14:14:57.042495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.042356 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:14:57.042495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.042435 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-5mtnm\"" Apr 16 14:14:57.050825 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.050793 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v"] Apr 16 14:14:57.129654 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.129612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.129654 ip-10-0-131-99 kubenswrapper[2571]: I0416 
14:14:57.129657 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df7318f9-7376-4975-9c6c-c2ec50d210fc-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.129867 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.129745 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.129867 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.129785 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.129867 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.129817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.129972 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.129883 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zj8v\" (UniqueName: \"kubernetes.io/projected/df7318f9-7376-4975-9c6c-c2ec50d210fc-kube-api-access-8zj8v\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.230892 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.230849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.230892 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.230895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.231163 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.231065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zj8v\" (UniqueName: \"kubernetes.io/projected/df7318f9-7376-4975-9c6c-c2ec50d210fc-kube-api-access-8zj8v\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.231224 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.231194 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.231277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.231227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df7318f9-7376-4975-9c6c-c2ec50d210fc-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.231277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.231251 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.231370 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.231280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.231370 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.231309 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.231504 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.231484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.231588 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.231572 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.233583 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.233564 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df7318f9-7376-4975-9c6c-c2ec50d210fc-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.240247 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.240216 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zj8v\" (UniqueName: \"kubernetes.io/projected/df7318f9-7376-4975-9c6c-c2ec50d210fc-kube-api-access-8zj8v\") pod 
\"stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.350709 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.350609 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:57.481879 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.481841 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v"] Apr 16 14:14:57.485472 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:14:57.485439 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf7318f9_7376_4975_9c6c_c2ec50d210fc.slice/crio-07d0bc523f738b24e73418389c972601a25bf43cc5175ef9424d992b0e27a21e WatchSource:0}: Error finding container 07d0bc523f738b24e73418389c972601a25bf43cc5175ef9424d992b0e27a21e: Status 404 returned error can't find the container with id 07d0bc523f738b24e73418389c972601a25bf43cc5175ef9424d992b0e27a21e Apr 16 14:14:57.906867 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.906818 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" event={"ID":"df7318f9-7376-4975-9c6c-c2ec50d210fc","Type":"ContainerStarted","Data":"082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951"} Apr 16 14:14:57.906867 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:57.906859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" event={"ID":"df7318f9-7376-4975-9c6c-c2ec50d210fc","Type":"ContainerStarted","Data":"07d0bc523f738b24e73418389c972601a25bf43cc5175ef9424d992b0e27a21e"} Apr 16 14:14:58.911918 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:14:58.911872 2571 generic.go:358] "Generic (PLEG): container finished" podID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerID="082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951" exitCode=0 Apr 16 14:14:58.912313 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:58.911933 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" event={"ID":"df7318f9-7376-4975-9c6c-c2ec50d210fc","Type":"ContainerDied","Data":"082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951"} Apr 16 14:14:59.919500 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:59.919450 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" event={"ID":"df7318f9-7376-4975-9c6c-c2ec50d210fc","Type":"ContainerStarted","Data":"5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552"} Apr 16 14:14:59.919500 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:59.919498 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" event={"ID":"df7318f9-7376-4975-9c6c-c2ec50d210fc","Type":"ContainerStarted","Data":"2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c"} Apr 16 14:14:59.920023 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:59.919711 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:14:59.945650 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:14:59.945578 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" podStartSLOduration=2.9455551570000003 podStartE2EDuration="2.945555157s" podCreationTimestamp="2026-04-16 14:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:14:59.943591783 +0000 UTC m=+946.572960064" watchObservedRunningTime="2026-04-16 14:14:59.945555157 +0000 UTC m=+946.574923436" Apr 16 14:15:07.351768 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:15:07.351712 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:15:07.352344 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:15:07.351815 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:15:07.354613 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:15:07.354585 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:15:07.955181 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:15:07.955152 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:15:29.959976 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:15:29.959947 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:16:50.393999 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:50.393960 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v"] Apr 16 14:16:50.394617 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:50.394401 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="main" 
containerID="cri-o://2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c" gracePeriod=30 Apr 16 14:16:50.394617 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:50.394456 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="tokenizer" containerID="cri-o://5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552" gracePeriod=30 Apr 16 14:16:51.298788 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.298753 2571 generic.go:358] "Generic (PLEG): container finished" podID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerID="2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c" exitCode=0 Apr 16 14:16:51.298967 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.298796 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" event={"ID":"df7318f9-7376-4975-9c6c-c2ec50d210fc","Type":"ContainerDied","Data":"2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c"} Apr 16 14:16:51.845701 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.845673 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:16:51.934323 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.934225 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zj8v\" (UniqueName: \"kubernetes.io/projected/df7318f9-7376-4975-9c6c-c2ec50d210fc-kube-api-access-8zj8v\") pod \"df7318f9-7376-4975-9c6c-c2ec50d210fc\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " Apr 16 14:16:51.934323 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.934273 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-tmp\") pod \"df7318f9-7376-4975-9c6c-c2ec50d210fc\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " Apr 16 14:16:51.934323 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.934294 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df7318f9-7376-4975-9c6c-c2ec50d210fc-tls-certs\") pod \"df7318f9-7376-4975-9c6c-c2ec50d210fc\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " Apr 16 14:16:51.934323 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.934320 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-uds\") pod \"df7318f9-7376-4975-9c6c-c2ec50d210fc\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " Apr 16 14:16:51.934657 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.934345 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-kserve-provision-location\") pod \"df7318f9-7376-4975-9c6c-c2ec50d210fc\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " Apr 16 
14:16:51.934657 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.934393 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-cache\") pod \"df7318f9-7376-4975-9c6c-c2ec50d210fc\" (UID: \"df7318f9-7376-4975-9c6c-c2ec50d210fc\") " Apr 16 14:16:51.934657 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.934617 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "df7318f9-7376-4975-9c6c-c2ec50d210fc" (UID: "df7318f9-7376-4975-9c6c-c2ec50d210fc"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:51.934796 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.934655 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "df7318f9-7376-4975-9c6c-c2ec50d210fc" (UID: "df7318f9-7376-4975-9c6c-c2ec50d210fc"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:51.934796 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.934726 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "df7318f9-7376-4975-9c6c-c2ec50d210fc" (UID: "df7318f9-7376-4975-9c6c-c2ec50d210fc"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:51.935137 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.935113 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "df7318f9-7376-4975-9c6c-c2ec50d210fc" (UID: "df7318f9-7376-4975-9c6c-c2ec50d210fc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:51.936450 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.936429 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7318f9-7376-4975-9c6c-c2ec50d210fc-kube-api-access-8zj8v" (OuterVolumeSpecName: "kube-api-access-8zj8v") pod "df7318f9-7376-4975-9c6c-c2ec50d210fc" (UID: "df7318f9-7376-4975-9c6c-c2ec50d210fc"). InnerVolumeSpecName "kube-api-access-8zj8v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:16:51.936522 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:51.936446 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7318f9-7376-4975-9c6c-c2ec50d210fc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "df7318f9-7376-4975-9c6c-c2ec50d210fc" (UID: "df7318f9-7376-4975-9c6c-c2ec50d210fc"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:16:52.034986 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.034942 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-uds\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:16:52.034986 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.034980 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:16:52.034986 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.034995 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:16:52.035327 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.035013 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8zj8v\" (UniqueName: \"kubernetes.io/projected/df7318f9-7376-4975-9c6c-c2ec50d210fc-kube-api-access-8zj8v\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:16:52.035327 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.035026 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df7318f9-7376-4975-9c6c-c2ec50d210fc-tokenizer-tmp\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:16:52.035327 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.035040 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df7318f9-7376-4975-9c6c-c2ec50d210fc-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:16:52.303777 ip-10-0-131-99 kubenswrapper[2571]: I0416 
14:16:52.303687 2571 generic.go:358] "Generic (PLEG): container finished" podID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerID="5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552" exitCode=0 Apr 16 14:16:52.303777 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.303760 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" Apr 16 14:16:52.303960 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.303756 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" event={"ID":"df7318f9-7376-4975-9c6c-c2ec50d210fc","Type":"ContainerDied","Data":"5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552"} Apr 16 14:16:52.303960 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.303866 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v" event={"ID":"df7318f9-7376-4975-9c6c-c2ec50d210fc","Type":"ContainerDied","Data":"07d0bc523f738b24e73418389c972601a25bf43cc5175ef9424d992b0e27a21e"} Apr 16 14:16:52.303960 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.303885 2571 scope.go:117] "RemoveContainer" containerID="5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552" Apr 16 14:16:52.312192 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.312172 2571 scope.go:117] "RemoveContainer" containerID="2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c" Apr 16 14:16:52.320304 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.320280 2571 scope.go:117] "RemoveContainer" containerID="082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951" Apr 16 14:16:52.324282 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.324249 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v"] Apr 16 14:16:52.328956 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.328877 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-wf69v"] Apr 16 14:16:52.329027 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.328996 2571 scope.go:117] "RemoveContainer" containerID="5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552" Apr 16 14:16:52.329350 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:16:52.329330 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552\": container with ID starting with 5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552 not found: ID does not exist" containerID="5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552" Apr 16 14:16:52.329459 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.329364 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552"} err="failed to get container status \"5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552\": rpc error: code = NotFound desc = could not find container \"5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552\": container with ID starting with 5e545b6c130ab1082335575071dd60458333e7e3dc952b07a0a674f2e50ec552 not found: ID does not exist" Apr 16 14:16:52.329459 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.329391 2571 scope.go:117] "RemoveContainer" containerID="2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c" Apr 16 14:16:52.329692 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:16:52.329673 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c\": container with ID starting with 2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c not found: ID does not exist" containerID="2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c" Apr 16 14:16:52.329731 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.329700 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c"} err="failed to get container status \"2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c\": rpc error: code = NotFound desc = could not find container \"2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c\": container with ID starting with 2f13f3b76b11d001dc25836376cb453e5f35d06046c194a91b4d96059e257a3c not found: ID does not exist" Apr 16 14:16:52.329731 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.329716 2571 scope.go:117] "RemoveContainer" containerID="082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951" Apr 16 14:16:52.329914 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:16:52.329893 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951\": container with ID starting with 082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951 not found: ID does not exist" containerID="082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951" Apr 16 14:16:52.329975 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:52.329921 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951"} err="failed to get container status \"082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951\": rpc error: code = NotFound desc = could not find container 
\"082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951\": container with ID starting with 082793d6bc5712da000674937537d2cc6bd7287a3eb59884f6c32c28d505b951 not found: ID does not exist" Apr 16 14:16:53.955312 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:53.955280 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" path="/var/lib/kubelet/pods/df7318f9-7376-4975-9c6c-c2ec50d210fc/volumes" Apr 16 14:16:58.318917 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.318879 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"] Apr 16 14:16:58.319592 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.319567 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="storage-initializer" Apr 16 14:16:58.319592 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.319594 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="storage-initializer" Apr 16 14:16:58.319732 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.319613 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="tokenizer" Apr 16 14:16:58.319732 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.319622 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="tokenizer" Apr 16 14:16:58.319732 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.319643 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="main" Apr 16 14:16:58.319732 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.319652 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="main" 
Apr 16 14:16:58.319732 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.319721 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="tokenizer" Apr 16 14:16:58.319965 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.319735 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="df7318f9-7376-4975-9c6c-c2ec50d210fc" containerName="main" Apr 16 14:16:58.324683 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.324655 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.328671 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.328639 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:16:58.328671 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.328656 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\"" Apr 16 14:16:58.328902 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.328681 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:16:58.328902 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.328645 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-hkbsk\"" Apr 16 14:16:58.328902 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.328659 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 14:16:58.333528 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.333500 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"] Apr 16 14:16:58.388387 
ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.388352 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.388540 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.388393 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.388540 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.388419 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.388540 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.388493 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.388540 
ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.388530 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.388690 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.388623 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67zjl\" (UniqueName: \"kubernetes.io/projected/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kube-api-access-67zjl\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.489616 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.489579 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67zjl\" (UniqueName: \"kubernetes.io/projected/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kube-api-access-67zjl\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.489804 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.489625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.489804 ip-10-0-131-99 kubenswrapper[2571]: I0416 
14:16:58.489649 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.489804 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.489680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.489804 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.489700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.489804 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.489723 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" Apr 16 14:16:58.490095 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.490049 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:16:58.490171 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.490142 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:16:58.490171 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.490153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:16:58.490261 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.490185 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:16:58.492306 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.492281 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:16:58.497629 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.497598 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67zjl\" (UniqueName: \"kubernetes.io/projected/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kube-api-access-67zjl\") pod \"stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:16:58.636048 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.636009 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:16:58.769005 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:58.768967 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"]
Apr 16 14:16:58.772308 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:16:58.772278 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dcc31c8_df7a_438d_8dfe_f5c0b6a29de4.slice/crio-a97641e9f69a076ec68a73a1beb9ea12cd020208be0ef0e1e44711cdaba44536 WatchSource:0}: Error finding container a97641e9f69a076ec68a73a1beb9ea12cd020208be0ef0e1e44711cdaba44536: Status 404 returned error can't find the container with id a97641e9f69a076ec68a73a1beb9ea12cd020208be0ef0e1e44711cdaba44536
Apr 16 14:16:59.330044 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:59.329995 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" event={"ID":"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4","Type":"ContainerStarted","Data":"636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557"}
Apr 16 14:16:59.330044 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:16:59.330046 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" event={"ID":"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4","Type":"ContainerStarted","Data":"a97641e9f69a076ec68a73a1beb9ea12cd020208be0ef0e1e44711cdaba44536"}
Apr 16 14:17:00.334684 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:00.334645 2571 generic.go:358] "Generic (PLEG): container finished" podID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerID="636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557" exitCode=0
Apr 16 14:17:00.335103 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:00.334741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" event={"ID":"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4","Type":"ContainerDied","Data":"636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557"}
Apr 16 14:17:01.341850 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:01.341811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" event={"ID":"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4","Type":"ContainerStarted","Data":"57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988"}
Apr 16 14:17:01.341850 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:01.341855 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" event={"ID":"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4","Type":"ContainerStarted","Data":"f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34"}
Apr 16 14:17:01.342653 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:01.341914 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:17:01.380406 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:01.380351 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" podStartSLOduration=3.380333157 podStartE2EDuration="3.380333157s" podCreationTimestamp="2026-04-16 14:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:17:01.379833341 +0000 UTC m=+1068.009201620" watchObservedRunningTime="2026-04-16 14:17:01.380333157 +0000 UTC m=+1068.009701460"
Apr 16 14:17:08.637222 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:08.637113 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:17:08.637222 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:08.637172 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:17:08.639942 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:08.639914 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:17:09.371500 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:09.371470 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:17:30.376589 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:17:30.376558 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:18:50.514500 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:50.514415 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"]
Apr 16 14:18:50.515049 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:50.514727 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="main" containerID="cri-o://f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34" gracePeriod=30
Apr 16 14:18:50.515049 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:50.514798 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="tokenizer" containerID="cri-o://57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988" gracePeriod=30
Apr 16 14:18:50.715691 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:50.715659 2571 generic.go:358] "Generic (PLEG): container finished" podID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerID="f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34" exitCode=0
Apr 16 14:18:50.715860 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:50.715732 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" event={"ID":"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4","Type":"ContainerDied","Data":"f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34"}
Apr 16 14:18:51.583615 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.583577 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"]
Apr 16 14:18:51.586832 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.586811 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:18:51.594502 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.594468 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"]
Apr 16 14:18:51.707244 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.703170 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf7ddcf-facf-4b2b-a412-430765f198d3-cert\") pod \"llmisvc-controller-manager-647dc49bd9-5g4kh\" (UID: \"aaf7ddcf-facf-4b2b-a412-430765f198d3\") " pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:18:51.707244 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.703259 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr8pd\" (UniqueName: \"kubernetes.io/projected/aaf7ddcf-facf-4b2b-a412-430765f198d3-kube-api-access-gr8pd\") pod \"llmisvc-controller-manager-647dc49bd9-5g4kh\" (UID: \"aaf7ddcf-facf-4b2b-a412-430765f198d3\") " pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:18:51.804712 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.804667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf7ddcf-facf-4b2b-a412-430765f198d3-cert\") pod \"llmisvc-controller-manager-647dc49bd9-5g4kh\" (UID: \"aaf7ddcf-facf-4b2b-a412-430765f198d3\") " pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:18:51.804906 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.804718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr8pd\" (UniqueName: \"kubernetes.io/projected/aaf7ddcf-facf-4b2b-a412-430765f198d3-kube-api-access-gr8pd\") pod \"llmisvc-controller-manager-647dc49bd9-5g4kh\" (UID: \"aaf7ddcf-facf-4b2b-a412-430765f198d3\") " pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:18:51.807223 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.807193 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf7ddcf-facf-4b2b-a412-430765f198d3-cert\") pod \"llmisvc-controller-manager-647dc49bd9-5g4kh\" (UID: \"aaf7ddcf-facf-4b2b-a412-430765f198d3\") " pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:18:51.813848 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.813819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr8pd\" (UniqueName: \"kubernetes.io/projected/aaf7ddcf-facf-4b2b-a412-430765f198d3-kube-api-access-gr8pd\") pod \"llmisvc-controller-manager-647dc49bd9-5g4kh\" (UID: \"aaf7ddcf-facf-4b2b-a412-430765f198d3\") " pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:18:51.898458 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.898417 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:18:51.959613 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:51.959582 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:18:52.034667 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.034616 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"]
Apr 16 14:18:52.037748 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:18:52.037708 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaaf7ddcf_facf_4b2b_a412_430765f198d3.slice/crio-fb789053aa2798d3517efa280825630283fe00950f339cf6d9e315f5009bd7e9 WatchSource:0}: Error finding container fb789053aa2798d3517efa280825630283fe00950f339cf6d9e315f5009bd7e9: Status 404 returned error can't find the container with id fb789053aa2798d3517efa280825630283fe00950f339cf6d9e315f5009bd7e9
Apr 16 14:18:52.039143 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.039117 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:18:52.107826 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.107789 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-tmp\") pod \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") "
Apr 16 14:18:52.108021 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.107852 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-uds\") pod \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") "
Apr 16 14:18:52.108021 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.107878 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kserve-provision-location\") pod \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") "
Apr 16 14:18:52.108021 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.107897 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67zjl\" (UniqueName: \"kubernetes.io/projected/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kube-api-access-67zjl\") pod \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") "
Apr 16 14:18:52.108021 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.107940 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-cache\") pod \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") "
Apr 16 14:18:52.108021 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.107966 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tls-certs\") pod \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\" (UID: \"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4\") "
Apr 16 14:18:52.108319 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.108194 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" (UID: "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:52.108319 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.108221 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" (UID: "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:52.108319 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.108250 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" (UID: "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:52.108689 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.108664 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" (UID: "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:52.110161 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.110138 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kube-api-access-67zjl" (OuterVolumeSpecName: "kube-api-access-67zjl") pod "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" (UID: "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4"). InnerVolumeSpecName "kube-api-access-67zjl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:18:52.110249 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.110186 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" (UID: "8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:18:52.208755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.208657 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-tmp\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:18:52.208755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.208690 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-uds\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:18:52.208755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.208702 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:18:52.208755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.208713 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-67zjl\" (UniqueName: \"kubernetes.io/projected/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-kube-api-access-67zjl\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:18:52.208755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.208722 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tokenizer-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:18:52.208755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.208731 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:18:52.724050 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.724002 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh" event={"ID":"aaf7ddcf-facf-4b2b-a412-430765f198d3","Type":"ContainerStarted","Data":"b487bae4440bd04a80d16577f041593504de89240e01915b3433a4f22b425993"}
Apr 16 14:18:52.724050 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.724049 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh" event={"ID":"aaf7ddcf-facf-4b2b-a412-430765f198d3","Type":"ContainerStarted","Data":"fb789053aa2798d3517efa280825630283fe00950f339cf6d9e315f5009bd7e9"}
Apr 16 14:18:52.724554 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.724125 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:18:52.725630 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.725603 2571 generic.go:358] "Generic (PLEG): container finished" podID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerID="57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988" exitCode=0
Apr 16 14:18:52.725749 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.725639 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" event={"ID":"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4","Type":"ContainerDied","Data":"57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988"}
Apr 16 14:18:52.725749 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.725669 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms" event={"ID":"8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4","Type":"ContainerDied","Data":"a97641e9f69a076ec68a73a1beb9ea12cd020208be0ef0e1e44711cdaba44536"}
Apr 16 14:18:52.725749 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.725671 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"
Apr 16 14:18:52.725749 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.725686 2571 scope.go:117] "RemoveContainer" containerID="57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988"
Apr 16 14:18:52.735163 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.735137 2571 scope.go:117] "RemoveContainer" containerID="f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34"
Apr 16 14:18:52.742673 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.742611 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh" podStartSLOduration=1.226463292 podStartE2EDuration="1.742594303s" podCreationTimestamp="2026-04-16 14:18:51 +0000 UTC" firstStartedPulling="2026-04-16 14:18:52.039255383 +0000 UTC m=+1178.668623641" lastFinishedPulling="2026-04-16 14:18:52.555386392 +0000 UTC m=+1179.184754652" observedRunningTime="2026-04-16 14:18:52.741397044 +0000 UTC m=+1179.370765321" watchObservedRunningTime="2026-04-16 14:18:52.742594303 +0000 UTC m=+1179.371962597"
Apr 16 14:18:52.745841 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.745819 2571 scope.go:117] "RemoveContainer" containerID="636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557"
Apr 16 14:18:52.758277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.758241 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"]
Apr 16 14:18:52.762044 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.762012 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6476dcb88-t7fms"]
Apr 16 14:18:52.762837 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.762815 2571 scope.go:117] "RemoveContainer" containerID="57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988"
Apr 16 14:18:52.763208 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:18:52.763179 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988\": container with ID starting with 57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988 not found: ID does not exist" containerID="57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988"
Apr 16 14:18:52.763324 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.763217 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988"} err="failed to get container status \"57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988\": rpc error: code = NotFound desc = could not find container \"57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988\": container with ID starting with 57efdccf92a4f124ff4ae788d66c31d9fc0e8e0970370fe08e7f8ee064e9b988 not found: ID does not exist"
Apr 16 14:18:52.763324 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.763239 2571 scope.go:117] "RemoveContainer" containerID="f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34"
Apr 16 14:18:52.763514 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:18:52.763496 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34\": container with ID starting with f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34 not found: ID does not exist" containerID="f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34"
Apr 16 14:18:52.763553 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.763530 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34"} err="failed to get container status \"f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34\": rpc error: code = NotFound desc = could not find container \"f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34\": container with ID starting with f33032398a6b9f91548b9e07ff5f02e587b54ce8dc18f720226b53ac84546f34 not found: ID does not exist"
Apr 16 14:18:52.763553 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.763547 2571 scope.go:117] "RemoveContainer" containerID="636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557"
Apr 16 14:18:52.763789 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:18:52.763774 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557\": container with ID starting with 636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557 not found: ID does not exist" containerID="636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557"
Apr 16 14:18:52.763833 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:52.763792 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557"} err="failed to get container status \"636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557\": rpc error: code = NotFound desc = could not find container \"636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557\": container with ID starting with 636d72de08c8cd673bb6ee31ed56e80b6bd70173cd43d0afb1a9fd388fc89557 not found: ID does not exist"
Apr 16 14:18:53.955637 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:18:53.955602 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" path="/var/lib/kubelet/pods/8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4/volumes"
Apr 16 14:19:13.933053 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:13.933019 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log"
Apr 16 14:19:13.933480 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:13.933194 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log"
Apr 16 14:19:23.732220 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:23.732190 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-647dc49bd9-5g4kh"
Apr 16 14:19:23.778570 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:23.778535 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-546c567d96-qn5bg"]
Apr 16 14:19:23.778855 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:23.778817 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" podUID="9c2961ae-97dc-4272-8934-95e91e621b8d" containerName="manager" containerID="cri-o://a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93" gracePeriod=30
Apr 16 14:19:23.837825 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:23.837777 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" podUID="9c2961ae-97dc-4272-8934-95e91e621b8d" containerName="manager" probeResult="failure" output="Get \"http://10.132.0.23:8081/readyz\": dial tcp 10.132.0.23:8081: connect: connection refused"
Apr 16 14:19:24.027985 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.027952 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg"
Apr 16 14:19:24.062935 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.062889 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c2961ae-97dc-4272-8934-95e91e621b8d-cert\") pod \"9c2961ae-97dc-4272-8934-95e91e621b8d\" (UID: \"9c2961ae-97dc-4272-8934-95e91e621b8d\") "
Apr 16 14:19:24.062935 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.062930 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r29ww\" (UniqueName: \"kubernetes.io/projected/9c2961ae-97dc-4272-8934-95e91e621b8d-kube-api-access-r29ww\") pod \"9c2961ae-97dc-4272-8934-95e91e621b8d\" (UID: \"9c2961ae-97dc-4272-8934-95e91e621b8d\") "
Apr 16 14:19:24.065121 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.065054 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2961ae-97dc-4272-8934-95e91e621b8d-cert" (OuterVolumeSpecName: "cert") pod "9c2961ae-97dc-4272-8934-95e91e621b8d" (UID: "9c2961ae-97dc-4272-8934-95e91e621b8d"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:19:24.065121 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.065102 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2961ae-97dc-4272-8934-95e91e621b8d-kube-api-access-r29ww" (OuterVolumeSpecName: "kube-api-access-r29ww") pod "9c2961ae-97dc-4272-8934-95e91e621b8d" (UID: "9c2961ae-97dc-4272-8934-95e91e621b8d"). InnerVolumeSpecName "kube-api-access-r29ww". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:19:24.163907 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.163862 2571 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c2961ae-97dc-4272-8934-95e91e621b8d-cert\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:19:24.163907 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.163900 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r29ww\" (UniqueName: \"kubernetes.io/projected/9c2961ae-97dc-4272-8934-95e91e621b8d-kube-api-access-r29ww\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:19:24.839926 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.839890 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c2961ae-97dc-4272-8934-95e91e621b8d" containerID="a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93" exitCode=0
Apr 16 14:19:24.840390 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.839932 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" event={"ID":"9c2961ae-97dc-4272-8934-95e91e621b8d","Type":"ContainerDied","Data":"a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93"}
Apr 16 14:19:24.840390 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.839956 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg" event={"ID":"9c2961ae-97dc-4272-8934-95e91e621b8d","Type":"ContainerDied","Data":"97b33595a032b4f63e9e306ad9d4b46388c9f72233ababda4f497ac6dedb80de"}
Apr 16 14:19:24.840390 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.839969 2571 scope.go:117] "RemoveContainer" containerID="a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93"
Apr 16 14:19:24.840390 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.839995 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-546c567d96-qn5bg"
Apr 16 14:19:24.848962 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.848940 2571 scope.go:117] "RemoveContainer" containerID="a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93"
Apr 16 14:19:24.849308 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:19:24.849287 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93\": container with ID starting with a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93 not found: ID does not exist" containerID="a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93"
Apr 16 14:19:24.849361 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.849319 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93"} err="failed to get container status \"a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93\": rpc error: code = NotFound desc = could not find container \"a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93\": container with ID starting with a5bcbc1703338b5900ba2fb48dc88a2719b4ec734698fb7ce72d89c065069c93 not found: ID does not exist"
Apr 16 14:19:24.861373 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.861338 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-546c567d96-qn5bg"]
Apr 16 14:19:24.864465 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:24.864431 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-546c567d96-qn5bg"]
Apr 16 14:19:25.955697 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:19:25.955663 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2961ae-97dc-4272-8934-95e91e621b8d" path="/var/lib/kubelet/pods/9c2961ae-97dc-4272-8934-95e91e621b8d/volumes"
Apr 16 14:20:46.937918 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.937879 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f"]
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938371 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="storage-initializer"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938392 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="storage-initializer"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938405 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="tokenizer"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938413 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="tokenizer"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938431 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c2961ae-97dc-4272-8934-95e91e621b8d" containerName="manager"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938439 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2961ae-97dc-4272-8934-95e91e621b8d" containerName="manager"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938447 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="main"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938455 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="main"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938528 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="tokenizer"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938540 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8dcc31c8-df7a-438d-8dfe-f5c0b6a29de4" containerName="main"
Apr 16 14:20:46.938829 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.938552 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c2961ae-97dc-4272-8934-95e91e621b8d" containerName="manager"
Apr 16 14:20:46.941649 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.941628 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f"
Apr 16 14:20:46.944755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.944730 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\""
Apr 16 14:20:46.944755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.944751 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-zfwzr\""
Apr 16 14:20:46.946012 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.945991 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 14:20:46.946012 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.946002 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 14:20:46.946216 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.946152 2571 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 14:20:46.953797 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.953773 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f"] Apr 16 14:20:46.994730 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.994691 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:46.994730 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.994732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71a289f0-a1ed-4b54-bb04-7045f369a23b-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:46.994947 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.994758 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:46.994947 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.994838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:46.994947 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.994877 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7pk2\" (UniqueName: \"kubernetes.io/projected/71a289f0-a1ed-4b54-bb04-7045f369a23b-kube-api-access-b7pk2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:46.994947 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:46.994930 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.095423 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.095382 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.095621 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.095439 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.095621 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.095474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71a289f0-a1ed-4b54-bb04-7045f369a23b-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.095621 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.095516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.095621 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.095558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.095621 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.095602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b7pk2\" (UniqueName: \"kubernetes.io/projected/71a289f0-a1ed-4b54-bb04-7045f369a23b-kube-api-access-b7pk2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.095886 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.095859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.096011 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.095934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.096099 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.096014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.096146 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.096109 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.098173 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.098149 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71a289f0-a1ed-4b54-bb04-7045f369a23b-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.105975 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.105942 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7pk2\" (UniqueName: \"kubernetes.io/projected/71a289f0-a1ed-4b54-bb04-7045f369a23b-kube-api-access-b7pk2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.251654 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.251554 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:47.391782 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:47.391744 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f"] Apr 16 14:20:47.394461 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:20:47.394420 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71a289f0_a1ed_4b54_bb04_7045f369a23b.slice/crio-a752fdba7fc59bdea340a4974bf48dee4b8bd5bec3c7fc0ab87509d7cbdf1322 WatchSource:0}: Error finding container a752fdba7fc59bdea340a4974bf48dee4b8bd5bec3c7fc0ab87509d7cbdf1322: Status 404 returned error can't find the container with id a752fdba7fc59bdea340a4974bf48dee4b8bd5bec3c7fc0ab87509d7cbdf1322 Apr 16 14:20:48.108426 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:48.108392 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" event={"ID":"71a289f0-a1ed-4b54-bb04-7045f369a23b","Type":"ContainerStarted","Data":"31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a"} Apr 16 14:20:48.108426 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:48.108431 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" event={"ID":"71a289f0-a1ed-4b54-bb04-7045f369a23b","Type":"ContainerStarted","Data":"a752fdba7fc59bdea340a4974bf48dee4b8bd5bec3c7fc0ab87509d7cbdf1322"} Apr 16 14:20:49.112484 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:49.112443 2571 generic.go:358] "Generic (PLEG): container finished" podID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerID="31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a" exitCode=0 Apr 16 14:20:49.112891 ip-10-0-131-99 kubenswrapper[2571]: I0416 
14:20:49.112490 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" event={"ID":"71a289f0-a1ed-4b54-bb04-7045f369a23b","Type":"ContainerDied","Data":"31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a"} Apr 16 14:20:50.117849 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:50.117809 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" event={"ID":"71a289f0-a1ed-4b54-bb04-7045f369a23b","Type":"ContainerStarted","Data":"ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a"} Apr 16 14:20:50.118277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:50.117858 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" event={"ID":"71a289f0-a1ed-4b54-bb04-7045f369a23b","Type":"ContainerStarted","Data":"f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da"} Apr 16 14:20:50.118277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:50.118029 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:50.140661 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:50.140596 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" podStartSLOduration=4.140572669 podStartE2EDuration="4.140572669s" podCreationTimestamp="2026-04-16 14:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:50.138258534 +0000 UTC m=+1296.767626812" watchObservedRunningTime="2026-04-16 14:20:50.140572669 +0000 UTC m=+1296.769940950" Apr 16 14:20:57.252590 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:20:57.252538 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:57.253125 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:57.252670 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:57.255273 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:57.255248 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:20:58.157471 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:20:58.157439 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:21:20.164585 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:21:20.164554 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:22:14.548804 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.548767 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:22:14.551867 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.551828 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.557590 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.557559 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-x4fdp\"" Apr 16 14:22:14.558170 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.558137 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 14:22:14.567275 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.567243 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:22:14.570345 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.570314 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.570519 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.570361 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51135e-33b7-4c21-a18c-a8035aa004fb-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.570519 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.570444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.570519 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.570490 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.570673 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.570545 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.570673 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.570573 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkb2p\" (UniqueName: \"kubernetes.io/projected/5b51135e-33b7-4c21-a18c-a8035aa004fb-kube-api-access-hkb2p\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.671175 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.671138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.671359 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.671180 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.671359 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.671225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.671359 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.671248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkb2p\" (UniqueName: \"kubernetes.io/projected/5b51135e-33b7-4c21-a18c-a8035aa004fb-kube-api-access-hkb2p\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.671359 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.671284 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.671359 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.671308 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51135e-33b7-4c21-a18c-a8035aa004fb-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.671616 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.671541 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.671616 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.671579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.671616 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.671610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.673537 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.673504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.673746 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.673726 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51135e-33b7-4c21-a18c-a8035aa004fb-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.679867 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.679835 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkb2p\" (UniqueName: \"kubernetes.io/projected/5b51135e-33b7-4c21-a18c-a8035aa004fb-kube-api-access-hkb2p\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.863199 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.863100 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:22:14.995123 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:14.995082 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:22:14.998830 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:22:14.998789 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b51135e_33b7_4c21_a18c_a8035aa004fb.slice/crio-0b5b8c685734e9fc588bf2da5501fac73d56003f79fb69ec7be88bf69df8f2ce WatchSource:0}: Error finding container 0b5b8c685734e9fc588bf2da5501fac73d56003f79fb69ec7be88bf69df8f2ce: Status 404 returned error can't find the container with id 0b5b8c685734e9fc588bf2da5501fac73d56003f79fb69ec7be88bf69df8f2ce Apr 16 14:22:15.410753 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:15.410715 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5b51135e-33b7-4c21-a18c-a8035aa004fb","Type":"ContainerStarted","Data":"14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5"} Apr 16 14:22:15.410753 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:15.410754 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5b51135e-33b7-4c21-a18c-a8035aa004fb","Type":"ContainerStarted","Data":"0b5b8c685734e9fc588bf2da5501fac73d56003f79fb69ec7be88bf69df8f2ce"} Apr 16 14:22:19.427083 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:19.427031 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b51135e-33b7-4c21-a18c-a8035aa004fb" containerID="14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5" exitCode=0 Apr 16 14:22:19.427552 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:22:19.427109 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5b51135e-33b7-4c21-a18c-a8035aa004fb","Type":"ContainerDied","Data":"14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5"} Apr 16 14:23:08.611400 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:08.611364 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5b51135e-33b7-4c21-a18c-a8035aa004fb","Type":"ContainerStarted","Data":"eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5"} Apr 16 14:23:08.632574 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:08.632424 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=5.611023383 podStartE2EDuration="54.632402004s" podCreationTimestamp="2026-04-16 14:22:14 +0000 UTC" firstStartedPulling="2026-04-16 14:22:19.428384909 +0000 UTC m=+1386.057753166" lastFinishedPulling="2026-04-16 14:23:08.449763515 +0000 UTC m=+1435.079131787" observedRunningTime="2026-04-16 14:23:08.630204595 +0000 UTC m=+1435.259572884" watchObservedRunningTime="2026-04-16 14:23:08.632402004 +0000 UTC m=+1435.261770285" Apr 16 14:23:56.830995 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:56.830955 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f"] Apr 16 14:23:56.831564 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:56.831333 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="main" containerID="cri-o://f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da" gracePeriod=30 Apr 16 14:23:56.831564 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:23:56.831423 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="tokenizer" containerID="cri-o://ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a" gracePeriod=30 Apr 16 14:23:57.791512 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:57.791471 2571 generic.go:358] "Generic (PLEG): container finished" podID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerID="f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da" exitCode=0 Apr 16 14:23:57.791687 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:57.791542 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" event={"ID":"71a289f0-a1ed-4b54-bb04-7045f369a23b","Type":"ContainerDied","Data":"f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da"} Apr 16 14:23:58.156692 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.156653 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.33:8082/healthz\": dial tcp 10.132.0.33:8082: connect: connection refused" Apr 16 14:23:58.283000 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.282974 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:23:58.368799 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.368762 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-kserve-provision-location\") pod \"71a289f0-a1ed-4b54-bb04-7045f369a23b\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " Apr 16 14:23:58.368995 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.368812 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71a289f0-a1ed-4b54-bb04-7045f369a23b-tls-certs\") pod \"71a289f0-a1ed-4b54-bb04-7045f369a23b\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " Apr 16 14:23:58.368995 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.368844 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-uds\") pod \"71a289f0-a1ed-4b54-bb04-7045f369a23b\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " Apr 16 14:23:58.368995 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.368871 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7pk2\" (UniqueName: \"kubernetes.io/projected/71a289f0-a1ed-4b54-bb04-7045f369a23b-kube-api-access-b7pk2\") pod \"71a289f0-a1ed-4b54-bb04-7045f369a23b\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " Apr 16 14:23:58.368995 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.368899 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-cache\") pod \"71a289f0-a1ed-4b54-bb04-7045f369a23b\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " 
Apr 16 14:23:58.368995 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.368922 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-tmp\") pod \"71a289f0-a1ed-4b54-bb04-7045f369a23b\" (UID: \"71a289f0-a1ed-4b54-bb04-7045f369a23b\") " Apr 16 14:23:58.369295 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.369199 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "71a289f0-a1ed-4b54-bb04-7045f369a23b" (UID: "71a289f0-a1ed-4b54-bb04-7045f369a23b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:23:58.369483 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.369444 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "71a289f0-a1ed-4b54-bb04-7045f369a23b" (UID: "71a289f0-a1ed-4b54-bb04-7045f369a23b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:23:58.369483 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.369462 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "71a289f0-a1ed-4b54-bb04-7045f369a23b" (UID: "71a289f0-a1ed-4b54-bb04-7045f369a23b"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:23:58.369790 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.369766 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "71a289f0-a1ed-4b54-bb04-7045f369a23b" (UID: "71a289f0-a1ed-4b54-bb04-7045f369a23b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:23:58.371189 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.371163 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a289f0-a1ed-4b54-bb04-7045f369a23b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "71a289f0-a1ed-4b54-bb04-7045f369a23b" (UID: "71a289f0-a1ed-4b54-bb04-7045f369a23b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:23:58.371283 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.371200 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a289f0-a1ed-4b54-bb04-7045f369a23b-kube-api-access-b7pk2" (OuterVolumeSpecName: "kube-api-access-b7pk2") pod "71a289f0-a1ed-4b54-bb04-7045f369a23b" (UID: "71a289f0-a1ed-4b54-bb04-7045f369a23b"). InnerVolumeSpecName "kube-api-access-b7pk2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:23:58.470410 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.470376 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:23:58.470410 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.470403 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71a289f0-a1ed-4b54-bb04-7045f369a23b-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:23:58.470410 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.470415 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-uds\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:23:58.470410 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.470426 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b7pk2\" (UniqueName: \"kubernetes.io/projected/71a289f0-a1ed-4b54-bb04-7045f369a23b-kube-api-access-b7pk2\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:23:58.470675 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.470436 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:23:58.470675 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.470444 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/71a289f0-a1ed-4b54-bb04-7045f369a23b-tokenizer-tmp\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:23:58.796820 ip-10-0-131-99 kubenswrapper[2571]: 
I0416 14:23:58.796732 2571 generic.go:358] "Generic (PLEG): container finished" podID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerID="ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a" exitCode=0 Apr 16 14:23:58.796820 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.796777 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" event={"ID":"71a289f0-a1ed-4b54-bb04-7045f369a23b","Type":"ContainerDied","Data":"ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a"} Apr 16 14:23:58.796820 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.796799 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" event={"ID":"71a289f0-a1ed-4b54-bb04-7045f369a23b","Type":"ContainerDied","Data":"a752fdba7fc59bdea340a4974bf48dee4b8bd5bec3c7fc0ab87509d7cbdf1322"} Apr 16 14:23:58.796820 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.796814 2571 scope.go:117] "RemoveContainer" containerID="ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a" Apr 16 14:23:58.797117 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.796822 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f" Apr 16 14:23:58.811112 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.811083 2571 scope.go:117] "RemoveContainer" containerID="f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da" Apr 16 14:23:58.818905 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.818881 2571 scope.go:117] "RemoveContainer" containerID="31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a" Apr 16 14:23:58.824495 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.824468 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f"] Apr 16 14:23:58.829055 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.828923 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche62t4f"] Apr 16 14:23:58.829145 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.829132 2571 scope.go:117] "RemoveContainer" containerID="ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a" Apr 16 14:23:58.829486 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:23:58.829457 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a\": container with ID starting with ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a not found: ID does not exist" containerID="ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a" Apr 16 14:23:58.829603 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.829495 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a"} err="failed to get container status \"ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a\": rpc 
error: code = NotFound desc = could not find container \"ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a\": container with ID starting with ab474a4b490ed7da610266f957ef2019af2ac1c7bb960afd5ae2341a63f7929a not found: ID does not exist" Apr 16 14:23:58.829603 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.829516 2571 scope.go:117] "RemoveContainer" containerID="f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da" Apr 16 14:23:58.829801 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:23:58.829788 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da\": container with ID starting with f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da not found: ID does not exist" containerID="f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da" Apr 16 14:23:58.829848 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.829809 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da"} err="failed to get container status \"f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da\": rpc error: code = NotFound desc = could not find container \"f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da\": container with ID starting with f5c4a4b9898b0fe1563ab6bc194ae42fa2c5ee0aabe8ceb5d3df70f2190f87da not found: ID does not exist" Apr 16 14:23:58.829848 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.829824 2571 scope.go:117] "RemoveContainer" containerID="31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a" Apr 16 14:23:58.830053 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:23:58.830037 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a\": container with ID starting with 31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a not found: ID does not exist" containerID="31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a" Apr 16 14:23:58.830132 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:58.830057 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a"} err="failed to get container status \"31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a\": rpc error: code = NotFound desc = could not find container \"31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a\": container with ID starting with 31a1bb81477599c976e73ecc9dc2d87503e50b9c345d5d13126b38ab2d8ddd1a not found: ID does not exist" Apr 16 14:23:59.955337 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:23:59.955305 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" path="/var/lib/kubelet/pods/71a289f0-a1ed-4b54-bb04-7045f369a23b/volumes" Apr 16 14:24:01.438545 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.438506 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb"] Apr 16 14:24:01.438966 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.438945 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="tokenizer" Apr 16 14:24:01.439012 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.438972 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="tokenizer" Apr 16 14:24:01.439012 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.439001 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="storage-initializer" Apr 16 14:24:01.439012 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.439010 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="storage-initializer" Apr 16 14:24:01.439130 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.439022 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="main" Apr 16 14:24:01.439130 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.439032 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="main" Apr 16 14:24:01.439193 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.439138 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="tokenizer" Apr 16 14:24:01.439193 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.439154 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="71a289f0-a1ed-4b54-bb04-7045f369a23b" containerName="main" Apr 16 14:24:01.444581 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.444554 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.447870 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.447841 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-jzbct\"" Apr 16 14:24:01.448043 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.447849 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 14:24:01.452751 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.452693 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb"] Apr 16 14:24:01.495439 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.495398 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cbff47a-e793-45f0-b44a-63850b3bfbcf-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.495660 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.495455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-model-cache\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.495660 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.495503 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-dshm\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.495660 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.495552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.495660 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.495605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chzm\" (UniqueName: \"kubernetes.io/projected/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kube-api-access-6chzm\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.495890 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.495681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-home\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.596798 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.596755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-dshm\") pod 
\"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.596969 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.596809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.596969 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.596843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chzm\" (UniqueName: \"kubernetes.io/projected/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kube-api-access-6chzm\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.596969 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.596885 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-home\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.597181 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.596968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cbff47a-e793-45f0-b44a-63850b3bfbcf-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.597181 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.597005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-model-cache\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.597279 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.597261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.597339 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.597319 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-home\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.597420 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.597403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-model-cache\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.599613 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.599576 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cbff47a-e793-45f0-b44a-63850b3bfbcf-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.599755 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.599718 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-dshm\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.606544 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.606510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chzm\" (UniqueName: \"kubernetes.io/projected/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kube-api-access-6chzm\") pod \"custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.759118 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.758997 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:01.901613 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.901581 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb"] Apr 16 14:24:01.904885 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:24:01.904839 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cbff47a_e793_45f0_b44a_63850b3bfbcf.slice/crio-b9d19c4e27240ecfdaad99d48213b89c6193500c9e5d65e32c6a343507c80ea3 WatchSource:0}: Error finding container b9d19c4e27240ecfdaad99d48213b89c6193500c9e5d65e32c6a343507c80ea3: Status 404 returned error can't find the container with id b9d19c4e27240ecfdaad99d48213b89c6193500c9e5d65e32c6a343507c80ea3 Apr 16 14:24:01.906915 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:01.906896 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:24:02.814369 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:02.814330 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" event={"ID":"0cbff47a-e793-45f0-b44a-63850b3bfbcf","Type":"ContainerStarted","Data":"0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec"} Apr 16 14:24:02.814810 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:02.814378 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" event={"ID":"0cbff47a-e793-45f0-b44a-63850b3bfbcf","Type":"ContainerStarted","Data":"b9d19c4e27240ecfdaad99d48213b89c6193500c9e5d65e32c6a343507c80ea3"} Apr 16 14:24:02.814810 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:02.814416 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:03.820421 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:03.820377 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" event={"ID":"0cbff47a-e793-45f0-b44a-63850b3bfbcf","Type":"ContainerStarted","Data":"de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b"} Apr 16 14:24:07.837323 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:07.837276 2571 generic.go:358] "Generic (PLEG): container finished" podID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerID="de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b" exitCode=0 Apr 16 14:24:07.837744 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:07.837357 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" event={"ID":"0cbff47a-e793-45f0-b44a-63850b3bfbcf","Type":"ContainerDied","Data":"de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b"} Apr 16 14:24:08.843685 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:08.843647 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" event={"ID":"0cbff47a-e793-45f0-b44a-63850b3bfbcf","Type":"ContainerStarted","Data":"f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0"} Apr 16 14:24:08.868993 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:08.868926 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podStartSLOduration=7.038368605 podStartE2EDuration="7.86890427s" podCreationTimestamp="2026-04-16 14:24:01 +0000 UTC" firstStartedPulling="2026-04-16 14:24:01.90703465 +0000 UTC m=+1488.536402906" lastFinishedPulling="2026-04-16 14:24:02.737570315 +0000 UTC m=+1489.366938571" observedRunningTime="2026-04-16 
14:24:08.86579795 +0000 UTC m=+1495.495166228" watchObservedRunningTime="2026-04-16 14:24:08.86890427 +0000 UTC m=+1495.498272550" Apr 16 14:24:11.759144 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:11.759097 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:11.759644 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:11.759161 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:11.760685 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:11.760652 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:24:13.961090 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:13.961037 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 14:24:13.961574 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:13.961383 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 14:24:21.760410 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:21.760343 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:24:21.776617 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:21.776581 
2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:24:31.760325 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:31.760271 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:24:41.760487 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:41.760422 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:24:49.324947 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:49.324894 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:24:49.325539 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:49.325230 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="5b51135e-33b7-4c21-a18c-a8035aa004fb" containerName="main" containerID="cri-o://eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5" gracePeriod=30 Apr 16 14:24:50.403465 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.403438 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:24:50.427694 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.426090 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-kserve-provision-location\") pod \"5b51135e-33b7-4c21-a18c-a8035aa004fb\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " Apr 16 14:24:50.427694 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.426159 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-dshm\") pod \"5b51135e-33b7-4c21-a18c-a8035aa004fb\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " Apr 16 14:24:50.427694 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.426193 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-home\") pod \"5b51135e-33b7-4c21-a18c-a8035aa004fb\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " Apr 16 14:24:50.427694 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.426241 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-model-cache\") pod \"5b51135e-33b7-4c21-a18c-a8035aa004fb\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " Apr 16 14:24:50.427694 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.426268 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51135e-33b7-4c21-a18c-a8035aa004fb-tls-certs\") pod \"5b51135e-33b7-4c21-a18c-a8035aa004fb\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " Apr 16 14:24:50.427694 ip-10-0-131-99 kubenswrapper[2571]: I0416 
14:24:50.426306 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkb2p\" (UniqueName: \"kubernetes.io/projected/5b51135e-33b7-4c21-a18c-a8035aa004fb-kube-api-access-hkb2p\") pod \"5b51135e-33b7-4c21-a18c-a8035aa004fb\" (UID: \"5b51135e-33b7-4c21-a18c-a8035aa004fb\") " Apr 16 14:24:50.427694 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.426797 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-home" (OuterVolumeSpecName: "home") pod "5b51135e-33b7-4c21-a18c-a8035aa004fb" (UID: "5b51135e-33b7-4c21-a18c-a8035aa004fb"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:24:50.428232 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.428012 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-model-cache" (OuterVolumeSpecName: "model-cache") pod "5b51135e-33b7-4c21-a18c-a8035aa004fb" (UID: "5b51135e-33b7-4c21-a18c-a8035aa004fb"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:24:50.431862 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.431023 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b51135e-33b7-4c21-a18c-a8035aa004fb-kube-api-access-hkb2p" (OuterVolumeSpecName: "kube-api-access-hkb2p") pod "5b51135e-33b7-4c21-a18c-a8035aa004fb" (UID: "5b51135e-33b7-4c21-a18c-a8035aa004fb"). InnerVolumeSpecName "kube-api-access-hkb2p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:24:50.431862 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.431229 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-dshm" (OuterVolumeSpecName: "dshm") pod "5b51135e-33b7-4c21-a18c-a8035aa004fb" (UID: "5b51135e-33b7-4c21-a18c-a8035aa004fb"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:24:50.431862 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.431653 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b51135e-33b7-4c21-a18c-a8035aa004fb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5b51135e-33b7-4c21-a18c-a8035aa004fb" (UID: "5b51135e-33b7-4c21-a18c-a8035aa004fb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:24:50.499512 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.499398 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b51135e-33b7-4c21-a18c-a8035aa004fb" (UID: "5b51135e-33b7-4c21-a18c-a8035aa004fb"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:24:50.527535 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.527490 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-home\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:24:50.527535 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.527536 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-model-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:24:50.527833 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.527562 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b51135e-33b7-4c21-a18c-a8035aa004fb-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:24:50.527833 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.527577 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hkb2p\" (UniqueName: \"kubernetes.io/projected/5b51135e-33b7-4c21-a18c-a8035aa004fb-kube-api-access-hkb2p\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:24:50.527833 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.527593 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:24:50.527833 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.527606 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b51135e-33b7-4c21-a18c-a8035aa004fb-dshm\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:24:50.994859 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.994820 2571 generic.go:358] 
"Generic (PLEG): container finished" podID="5b51135e-33b7-4c21-a18c-a8035aa004fb" containerID="eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5" exitCode=0 Apr 16 14:24:50.995027 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.994900 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5b51135e-33b7-4c21-a18c-a8035aa004fb","Type":"ContainerDied","Data":"eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5"} Apr 16 14:24:50.995027 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.994930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5b51135e-33b7-4c21-a18c-a8035aa004fb","Type":"ContainerDied","Data":"0b5b8c685734e9fc588bf2da5501fac73d56003f79fb69ec7be88bf69df8f2ce"} Apr 16 14:24:50.995027 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.994946 2571 scope.go:117] "RemoveContainer" containerID="eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5" Apr 16 14:24:50.995027 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:50.994953 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:24:51.014592 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:51.014564 2571 scope.go:117] "RemoveContainer" containerID="14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5" Apr 16 14:24:51.026034 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:51.026008 2571 scope.go:117] "RemoveContainer" containerID="eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5" Apr 16 14:24:51.026499 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:24:51.026473 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5\": container with ID starting with eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5 not found: ID does not exist" containerID="eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5" Apr 16 14:24:51.026596 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:51.026512 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5"} err="failed to get container status \"eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5\": rpc error: code = NotFound desc = could not find container \"eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5\": container with ID starting with eb83e62e71dad413364564326a2b58ee11bbfbc4124aff99b331d70f6544a5f5 not found: ID does not exist" Apr 16 14:24:51.026596 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:51.026541 2571 scope.go:117] "RemoveContainer" containerID="14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5" Apr 16 14:24:51.026866 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:24:51.026835 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5\": container with ID starting with 14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5 not found: ID does not exist" containerID="14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5" Apr 16 14:24:51.026982 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:51.026867 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5"} err="failed to get container status \"14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5\": rpc error: code = NotFound desc = could not find container \"14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5\": container with ID starting with 14dffdfc180cdb5e64586910ffbeb676f34e241f850d77eb1eec658edfa2dfb5 not found: ID does not exist" Apr 16 14:24:51.028065 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:51.028042 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:24:51.034929 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:51.034899 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:24:51.759779 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:51.759733 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:24:51.956764 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:51.956722 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b51135e-33b7-4c21-a18c-a8035aa004fb" path="/var/lib/kubelet/pods/5b51135e-33b7-4c21-a18c-a8035aa004fb/volumes" 
Apr 16 14:24:52.774241 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.774189 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh"] Apr 16 14:24:52.774638 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.774486 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b51135e-33b7-4c21-a18c-a8035aa004fb" containerName="storage-initializer" Apr 16 14:24:52.774638 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.774499 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b51135e-33b7-4c21-a18c-a8035aa004fb" containerName="storage-initializer" Apr 16 14:24:52.774638 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.774511 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b51135e-33b7-4c21-a18c-a8035aa004fb" containerName="main" Apr 16 14:24:52.774638 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.774517 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b51135e-33b7-4c21-a18c-a8035aa004fb" containerName="main" Apr 16 14:24:52.774638 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.774573 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b51135e-33b7-4c21-a18c-a8035aa004fb" containerName="main" Apr 16 14:24:52.779893 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.779864 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.782923 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.782893 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-2msdc\"" Apr 16 14:24:52.783964 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.783935 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 14:24:52.791711 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.791681 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh"] Apr 16 14:24:52.848302 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.848265 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.848541 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.848323 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45e7587e-a895-4484-be3c-0e56a44dcb13-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.848541 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.848428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.848541 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.848492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.848541 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.848530 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v9zz\" (UniqueName: \"kubernetes.io/projected/45e7587e-a895-4484-be3c-0e56a44dcb13-kube-api-access-5v9zz\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.848733 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.848573 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.949437 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.949394 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.949630 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.949459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.949630 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.949497 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5v9zz\" (UniqueName: \"kubernetes.io/projected/45e7587e-a895-4484-be3c-0e56a44dcb13-kube-api-access-5v9zz\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.949630 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.949532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.949630 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.949597 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.949868 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.949643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45e7587e-a895-4484-be3c-0e56a44dcb13-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.949868 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.949833 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.949960 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.949895 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.950017 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.949996 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.950115 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.950096 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.952811 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.952777 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45e7587e-a895-4484-be3c-0e56a44dcb13-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:52.962756 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:52.962720 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v9zz\" (UniqueName: \"kubernetes.io/projected/45e7587e-a895-4484-be3c-0e56a44dcb13-kube-api-access-5v9zz\") pod \"scheduler-inline-config-test-kserve-router-scheduler-79d847dndh\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:53.091152 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:53.091033 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:53.233678 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:53.233637 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh"] Apr 16 14:24:53.237358 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:24:53.237319 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45e7587e_a895_4484_be3c_0e56a44dcb13.slice/crio-ed53ba0bdbeae4cbd370091abf9df986912d26e661ea21f31055e470cd218286 WatchSource:0}: Error finding container ed53ba0bdbeae4cbd370091abf9df986912d26e661ea21f31055e470cd218286: Status 404 returned error can't find the container with id ed53ba0bdbeae4cbd370091abf9df986912d26e661ea21f31055e470cd218286 Apr 16 14:24:54.009163 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:54.009124 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" event={"ID":"45e7587e-a895-4484-be3c-0e56a44dcb13","Type":"ContainerStarted","Data":"6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282"} Apr 16 14:24:54.009575 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:54.009171 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" event={"ID":"45e7587e-a895-4484-be3c-0e56a44dcb13","Type":"ContainerStarted","Data":"ed53ba0bdbeae4cbd370091abf9df986912d26e661ea21f31055e470cd218286"} Apr 16 14:24:55.019972 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:55.019931 2571 generic.go:358] "Generic (PLEG): container finished" podID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerID="6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282" exitCode=0 Apr 16 14:24:55.020583 ip-10-0-131-99 kubenswrapper[2571]: I0416 
14:24:55.019989 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" event={"ID":"45e7587e-a895-4484-be3c-0e56a44dcb13","Type":"ContainerDied","Data":"6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282"} Apr 16 14:24:56.025806 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:56.025768 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" event={"ID":"45e7587e-a895-4484-be3c-0e56a44dcb13","Type":"ContainerStarted","Data":"e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66"} Apr 16 14:24:56.026210 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:56.025814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" event={"ID":"45e7587e-a895-4484-be3c-0e56a44dcb13","Type":"ContainerStarted","Data":"a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01"} Apr 16 14:24:56.026210 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:56.025897 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:24:56.050802 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:24:56.050738 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" podStartSLOduration=4.050716801 podStartE2EDuration="4.050716801s" podCreationTimestamp="2026-04-16 14:24:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:24:56.047448285 +0000 UTC m=+1542.676816563" watchObservedRunningTime="2026-04-16 14:24:56.050716801 +0000 UTC m=+1542.680085082" Apr 16 14:25:01.759814 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:25:01.759760 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:25:03.091691 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:03.091648 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:25:03.092202 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:03.091703 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:25:03.094754 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:03.094723 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:25:04.058780 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:04.058741 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:25:11.760086 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:11.760011 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:25:21.760419 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:21.760367 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" 
podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:25:25.062876 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:25.062844 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:25:26.305962 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:26.305918 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh"] Apr 16 14:25:26.306580 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:26.306270 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="main" containerID="cri-o://a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01" gracePeriod=30 Apr 16 14:25:26.306580 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:26.306284 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="tokenizer" containerID="cri-o://e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66" gracePeriod=30 Apr 16 14:25:27.134352 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.134308 2571 generic.go:358] "Generic (PLEG): container finished" podID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerID="a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01" exitCode=0 Apr 16 14:25:27.134574 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.134379 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" 
event={"ID":"45e7587e-a895-4484-be3c-0e56a44dcb13","Type":"ContainerDied","Data":"a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01"} Apr 16 14:25:27.758014 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.757988 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:25:27.761794 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.761772 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-tmp\") pod \"45e7587e-a895-4484-be3c-0e56a44dcb13\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " Apr 16 14:25:27.761944 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.761803 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-cache\") pod \"45e7587e-a895-4484-be3c-0e56a44dcb13\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " Apr 16 14:25:27.761944 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.761842 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-uds\") pod \"45e7587e-a895-4484-be3c-0e56a44dcb13\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " Apr 16 14:25:27.761944 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.761865 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v9zz\" (UniqueName: \"kubernetes.io/projected/45e7587e-a895-4484-be3c-0e56a44dcb13-kube-api-access-5v9zz\") pod \"45e7587e-a895-4484-be3c-0e56a44dcb13\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " Apr 16 14:25:27.761944 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.761890 2571 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45e7587e-a895-4484-be3c-0e56a44dcb13-tls-certs\") pod \"45e7587e-a895-4484-be3c-0e56a44dcb13\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " Apr 16 14:25:27.761944 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.761913 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-kserve-provision-location\") pod \"45e7587e-a895-4484-be3c-0e56a44dcb13\" (UID: \"45e7587e-a895-4484-be3c-0e56a44dcb13\") " Apr 16 14:25:27.762224 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.762156 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "45e7587e-a895-4484-be3c-0e56a44dcb13" (UID: "45e7587e-a895-4484-be3c-0e56a44dcb13"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:25:27.762277 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.762238 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "45e7587e-a895-4484-be3c-0e56a44dcb13" (UID: "45e7587e-a895-4484-be3c-0e56a44dcb13"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:25:27.762321 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.762272 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "45e7587e-a895-4484-be3c-0e56a44dcb13" (UID: "45e7587e-a895-4484-be3c-0e56a44dcb13"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:25:27.762816 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.762790 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "45e7587e-a895-4484-be3c-0e56a44dcb13" (UID: "45e7587e-a895-4484-be3c-0e56a44dcb13"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:25:27.764154 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.764130 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e7587e-a895-4484-be3c-0e56a44dcb13-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "45e7587e-a895-4484-be3c-0e56a44dcb13" (UID: "45e7587e-a895-4484-be3c-0e56a44dcb13"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:25:27.764254 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.764198 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e7587e-a895-4484-be3c-0e56a44dcb13-kube-api-access-5v9zz" (OuterVolumeSpecName: "kube-api-access-5v9zz") pod "45e7587e-a895-4484-be3c-0e56a44dcb13" (UID: "45e7587e-a895-4484-be3c-0e56a44dcb13"). InnerVolumeSpecName "kube-api-access-5v9zz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:25:27.863086 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.862985 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-uds\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:25:27.863086 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.863015 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5v9zz\" (UniqueName: \"kubernetes.io/projected/45e7587e-a895-4484-be3c-0e56a44dcb13-kube-api-access-5v9zz\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:25:27.863086 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.863025 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45e7587e-a895-4484-be3c-0e56a44dcb13-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:25:27.863086 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.863037 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:25:27.863086 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.863045 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-tmp\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:25:27.863086 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:27.863056 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45e7587e-a895-4484-be3c-0e56a44dcb13-tokenizer-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:25:28.139611 ip-10-0-131-99 kubenswrapper[2571]: 
I0416 14:25:28.139577 2571 generic.go:358] "Generic (PLEG): container finished" podID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerID="e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66" exitCode=0 Apr 16 14:25:28.139793 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.139658 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" event={"ID":"45e7587e-a895-4484-be3c-0e56a44dcb13","Type":"ContainerDied","Data":"e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66"} Apr 16 14:25:28.139793 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.139668 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" Apr 16 14:25:28.139793 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.139689 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh" event={"ID":"45e7587e-a895-4484-be3c-0e56a44dcb13","Type":"ContainerDied","Data":"ed53ba0bdbeae4cbd370091abf9df986912d26e661ea21f31055e470cd218286"} Apr 16 14:25:28.139793 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.139710 2571 scope.go:117] "RemoveContainer" containerID="e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66" Apr 16 14:25:28.148258 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.148234 2571 scope.go:117] "RemoveContainer" containerID="a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01" Apr 16 14:25:28.156821 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.156800 2571 scope.go:117] "RemoveContainer" containerID="6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282" Apr 16 14:25:28.168019 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.166850 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh"] Apr 16 14:25:28.168019 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.167972 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-79d847dndh"] Apr 16 14:25:28.174935 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.174908 2571 scope.go:117] "RemoveContainer" containerID="e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66" Apr 16 14:25:28.175296 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:25:28.175272 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66\": container with ID starting with e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66 not found: ID does not exist" containerID="e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66" Apr 16 14:25:28.175385 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.175304 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66"} err="failed to get container status \"e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66\": rpc error: code = NotFound desc = could not find container \"e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66\": container with ID starting with e2172ad35430a146aa1ae17cbf89e0a89399b7c02bb33747446b17cb99a03a66 not found: ID does not exist" Apr 16 14:25:28.175385 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.175328 2571 scope.go:117] "RemoveContainer" containerID="a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01" Apr 16 14:25:28.175586 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:25:28.175570 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01\": container with ID starting with a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01 not found: ID does not exist" containerID="a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01" Apr 16 14:25:28.175632 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.175590 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01"} err="failed to get container status \"a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01\": rpc error: code = NotFound desc = could not find container \"a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01\": container with ID starting with a41b1178b4099ef4030dc5d22b05f91714c9245539a3af8720e572af43237f01 not found: ID does not exist" Apr 16 14:25:28.175632 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.175605 2571 scope.go:117] "RemoveContainer" containerID="6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282" Apr 16 14:25:28.175888 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:25:28.175870 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282\": container with ID starting with 6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282 not found: ID does not exist" containerID="6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282" Apr 16 14:25:28.175931 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:28.175893 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282"} err="failed to get container status \"6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282\": rpc error: code = NotFound desc = could not find container 
\"6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282\": container with ID starting with 6aac17842cf06d1d0e2660665a6aa15d8b43ab54f80039e342d17302007c4282 not found: ID does not exist" Apr 16 14:25:29.956501 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:29.956455 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" path="/var/lib/kubelet/pods/45e7587e-a895-4484-be3c-0e56a44dcb13/volumes" Apr 16 14:25:31.760197 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:31.760147 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:25:41.759481 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:41.759424 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8001/health\": dial tcp 10.132.0.35:8001: connect: connection refused" Apr 16 14:25:51.769893 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:51.769861 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:25:51.789206 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:25:51.789173 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:26:04.401930 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:04.401832 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb"] Apr 16 14:26:04.402504 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:26:04.402443 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main" containerID="cri-o://f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0" gracePeriod=30 Apr 16 14:26:34.402895 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.402816 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="llm-d-routing-sidecar" containerID="cri-o://0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec" gracePeriod=2 Apr 16 14:26:34.671831 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.671800 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb_0cbff47a-e793-45f0-b44a-63850b3bfbcf/main/0.log" Apr 16 14:26:34.672485 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.672469 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:26:34.707984 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.707951 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-home\") pod \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " Apr 16 14:26:34.708177 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.708025 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cbff47a-e793-45f0-b44a-63850b3bfbcf-tls-certs\") pod \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " Apr 16 14:26:34.708177 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.708045 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-dshm\") pod \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " Apr 16 14:26:34.708177 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.708065 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-model-cache\") pod \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " Apr 16 14:26:34.708177 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.708101 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6chzm\" (UniqueName: \"kubernetes.io/projected/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kube-api-access-6chzm\") pod \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " Apr 16 14:26:34.708384 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.708266 
2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kserve-provision-location\") pod \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\" (UID: \"0cbff47a-e793-45f0-b44a-63850b3bfbcf\") " Apr 16 14:26:34.708384 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.708366 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-model-cache" (OuterVolumeSpecName: "model-cache") pod "0cbff47a-e793-45f0-b44a-63850b3bfbcf" (UID: "0cbff47a-e793-45f0-b44a-63850b3bfbcf"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:34.708474 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.708406 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-home" (OuterVolumeSpecName: "home") pod "0cbff47a-e793-45f0-b44a-63850b3bfbcf" (UID: "0cbff47a-e793-45f0-b44a-63850b3bfbcf"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:34.708604 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.708538 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-model-cache\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:26:34.708604 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.708562 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-home\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:26:34.711003 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.710700 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-dshm" (OuterVolumeSpecName: "dshm") pod "0cbff47a-e793-45f0-b44a-63850b3bfbcf" (UID: "0cbff47a-e793-45f0-b44a-63850b3bfbcf"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:34.711003 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.710984 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kube-api-access-6chzm" (OuterVolumeSpecName: "kube-api-access-6chzm") pod "0cbff47a-e793-45f0-b44a-63850b3bfbcf" (UID: "0cbff47a-e793-45f0-b44a-63850b3bfbcf"). InnerVolumeSpecName "kube-api-access-6chzm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:26:34.712268 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.712244 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbff47a-e793-45f0-b44a-63850b3bfbcf-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0cbff47a-e793-45f0-b44a-63850b3bfbcf" (UID: "0cbff47a-e793-45f0-b44a-63850b3bfbcf"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:26:34.765159 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.765115 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0cbff47a-e793-45f0-b44a-63850b3bfbcf" (UID: "0cbff47a-e793-45f0-b44a-63850b3bfbcf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:34.808982 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.808940 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cbff47a-e793-45f0-b44a-63850b3bfbcf-tls-certs\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:26:34.808982 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.808972 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-dshm\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:26:34.808982 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.808982 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6chzm\" (UniqueName: \"kubernetes.io/projected/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kube-api-access-6chzm\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:26:34.808982 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:34.808991 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cbff47a-e793-45f0-b44a-63850b3bfbcf-kserve-provision-location\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\"" Apr 16 14:26:35.365386 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.365353 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb_0cbff47a-e793-45f0-b44a-63850b3bfbcf/main/0.log" Apr 16 14:26:35.366006 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.365983 2571 generic.go:358] "Generic (PLEG): container finished" podID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerID="f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0" exitCode=137 Apr 16 14:26:35.366006 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.366004 2571 generic.go:358] "Generic (PLEG): container finished" podID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerID="0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec" exitCode=0 Apr 16 14:26:35.366174 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.366062 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" Apr 16 14:26:35.366174 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.366090 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" event={"ID":"0cbff47a-e793-45f0-b44a-63850b3bfbcf","Type":"ContainerDied","Data":"f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0"} Apr 16 14:26:35.366174 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.366128 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" event={"ID":"0cbff47a-e793-45f0-b44a-63850b3bfbcf","Type":"ContainerDied","Data":"0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec"} Apr 16 14:26:35.366174 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.366140 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb" 
event={"ID":"0cbff47a-e793-45f0-b44a-63850b3bfbcf","Type":"ContainerDied","Data":"b9d19c4e27240ecfdaad99d48213b89c6193500c9e5d65e32c6a343507c80ea3"} Apr 16 14:26:35.366174 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.366156 2571 scope.go:117] "RemoveContainer" containerID="f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0" Apr 16 14:26:35.388173 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.388030 2571 scope.go:117] "RemoveContainer" containerID="de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b" Apr 16 14:26:35.388943 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.388920 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb"] Apr 16 14:26:35.392988 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.392961 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-79b87f74d8-wllcb"] Apr 16 14:26:35.453094 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.452926 2571 scope.go:117] "RemoveContainer" containerID="0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec" Apr 16 14:26:35.460980 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.460958 2571 scope.go:117] "RemoveContainer" containerID="f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0" Apr 16 14:26:35.461309 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:26:35.461285 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0\": container with ID starting with f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0 not found: ID does not exist" containerID="f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0" Apr 16 14:26:35.461404 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.461325 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0"} err="failed to get container status \"f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0\": rpc error: code = NotFound desc = could not find container \"f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0\": container with ID starting with f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0 not found: ID does not exist" Apr 16 14:26:35.461404 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.461352 2571 scope.go:117] "RemoveContainer" containerID="de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b" Apr 16 14:26:35.461651 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:26:35.461625 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b\": container with ID starting with de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b not found: ID does not exist" containerID="de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b" Apr 16 14:26:35.461725 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.461663 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b"} err="failed to get container status \"de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b\": rpc error: code = NotFound desc = could not find container \"de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b\": container with ID starting with de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b not found: ID does not exist" Apr 16 14:26:35.461725 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.461691 2571 scope.go:117] "RemoveContainer" containerID="0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec" Apr 16 14:26:35.461959 ip-10-0-131-99 
kubenswrapper[2571]: E0416 14:26:35.461933 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec\": container with ID starting with 0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec not found: ID does not exist" containerID="0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec"
Apr 16 14:26:35.462014 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.461968 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec"} err="failed to get container status \"0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec\": rpc error: code = NotFound desc = could not find container \"0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec\": container with ID starting with 0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec not found: ID does not exist"
Apr 16 14:26:35.462014 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.461993 2571 scope.go:117] "RemoveContainer" containerID="f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0"
Apr 16 14:26:35.462268 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.462247 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0"} err="failed to get container status \"f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0\": rpc error: code = NotFound desc = could not find container \"f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0\": container with ID starting with f73b6b381e7c71475d0b92422e6b082fd51c6eab2e2d8a5a551d5b215cf0e8e0 not found: ID does not exist"
Apr 16 14:26:35.462319 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.462269 2571 scope.go:117] "RemoveContainer" containerID="de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b"
Apr 16 14:26:35.462528 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.462508 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b"} err="failed to get container status \"de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b\": rpc error: code = NotFound desc = could not find container \"de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b\": container with ID starting with de0a4c7d09c972974cc0fef4e469dc1c531935fd5f2438fa4427d5a3dfb21a9b not found: ID does not exist"
Apr 16 14:26:35.462574 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.462531 2571 scope.go:117] "RemoveContainer" containerID="0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec"
Apr 16 14:26:35.462774 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.462755 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec"} err="failed to get container status \"0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec\": rpc error: code = NotFound desc = could not find container \"0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec\": container with ID starting with 0606f128a9f7d2b3b339fe3be907e590d77e58647e445ceba0a82a5c6d9904ec not found: ID does not exist"
Apr 16 14:26:35.954792 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:26:35.954747 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" path="/var/lib/kubelet/pods/0cbff47a-e793-45f0-b44a-63850b3bfbcf/volumes"
Apr 16 14:29:13.980909 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:13.980877 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log"
Apr 16 14:29:13.982951 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:13.982914 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log"
Apr 16 14:29:21.563705 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.563663 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xc8sr/must-gather-rsgmr"]
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.563940 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="storage-initializer"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.563950 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="storage-initializer"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.563960 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="llm-d-routing-sidecar"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.563965 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="llm-d-routing-sidecar"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.563974 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="storage-initializer"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.563979 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="storage-initializer"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.563989 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="main"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.563994 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="main"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.564000 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="tokenizer"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.564006 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="tokenizer"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.564013 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.564019 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.564089 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="main"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.564097 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="main"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.564105 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cbff47a-e793-45f0-b44a-63850b3bfbcf" containerName="llm-d-routing-sidecar"
Apr 16 14:29:21.564217 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.564111 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="45e7587e-a895-4484-be3c-0e56a44dcb13" containerName="tokenizer"
Apr 16 14:29:21.567022 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.566998 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:29:21.570343 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.570317 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xc8sr\"/\"openshift-service-ca.crt\""
Apr 16 14:29:21.570536 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.570516 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xc8sr\"/\"default-dockercfg-5ftgb\""
Apr 16 14:29:21.571639 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.571618 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xc8sr\"/\"kube-root-ca.crt\""
Apr 16 14:29:21.575004 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.574983 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xc8sr/must-gather-rsgmr"]
Apr 16 14:29:21.582902 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.582861 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b053850-00e6-4302-ab0e-2b8b1921ab7b-must-gather-output\") pod \"must-gather-rsgmr\" (UID: \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\") " pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:29:21.583157 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.582919 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5qf4\" (UniqueName: \"kubernetes.io/projected/6b053850-00e6-4302-ab0e-2b8b1921ab7b-kube-api-access-h5qf4\") pod \"must-gather-rsgmr\" (UID: \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\") " pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:29:21.683345 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.683301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5qf4\" (UniqueName: \"kubernetes.io/projected/6b053850-00e6-4302-ab0e-2b8b1921ab7b-kube-api-access-h5qf4\") pod \"must-gather-rsgmr\" (UID: \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\") " pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:29:21.683529 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.683401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b053850-00e6-4302-ab0e-2b8b1921ab7b-must-gather-output\") pod \"must-gather-rsgmr\" (UID: \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\") " pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:29:21.683721 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.683704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b053850-00e6-4302-ab0e-2b8b1921ab7b-must-gather-output\") pod \"must-gather-rsgmr\" (UID: \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\") " pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:29:21.695367 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.695329 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5qf4\" (UniqueName: \"kubernetes.io/projected/6b053850-00e6-4302-ab0e-2b8b1921ab7b-kube-api-access-h5qf4\") pod \"must-gather-rsgmr\" (UID: \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\") " pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:29:21.877134 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:21.877099 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:29:22.010249 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:22.010213 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xc8sr/must-gather-rsgmr"]
Apr 16 14:29:22.013707 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:29:22.013670 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b053850_00e6_4302_ab0e_2b8b1921ab7b.slice/crio-3c5a299a6ce94f2d4dbcc80a0cd6b65dae672b147fde89166b27872598e420cc WatchSource:0}: Error finding container 3c5a299a6ce94f2d4dbcc80a0cd6b65dae672b147fde89166b27872598e420cc: Status 404 returned error can't find the container with id 3c5a299a6ce94f2d4dbcc80a0cd6b65dae672b147fde89166b27872598e420cc
Apr 16 14:29:22.015360 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:22.015343 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:29:22.897267 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:22.897207 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" event={"ID":"6b053850-00e6-4302-ab0e-2b8b1921ab7b","Type":"ContainerStarted","Data":"3c5a299a6ce94f2d4dbcc80a0cd6b65dae672b147fde89166b27872598e420cc"}
Apr 16 14:29:26.920932 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:26.920896 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" event={"ID":"6b053850-00e6-4302-ab0e-2b8b1921ab7b","Type":"ContainerStarted","Data":"116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd"}
Apr 16 14:29:27.926112 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:27.926056 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" event={"ID":"6b053850-00e6-4302-ab0e-2b8b1921ab7b","Type":"ContainerStarted","Data":"0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25"}
Apr 16 14:29:27.944007 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:27.943950 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" podStartSLOduration=2.243262773 podStartE2EDuration="6.943932712s" podCreationTimestamp="2026-04-16 14:29:21 +0000 UTC" firstStartedPulling="2026-04-16 14:29:22.015488772 +0000 UTC m=+1808.644857029" lastFinishedPulling="2026-04-16 14:29:26.716158697 +0000 UTC m=+1813.345526968" observedRunningTime="2026-04-16 14:29:27.941823137 +0000 UTC m=+1814.571191417" watchObservedRunningTime="2026-04-16 14:29:27.943932712 +0000 UTC m=+1814.573300994"
Apr 16 14:29:51.732019 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:51.731972 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mdhsz_cf7faa71-231f-4467-a789-acd6da492013/discovery/0.log"
Apr 16 14:29:52.611714 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:52.611682 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mdhsz_cf7faa71-231f-4467-a789-acd6da492013/discovery/0.log"
Apr 16 14:29:53.532019 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:53.531985 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-f7hwv_17b6a0f3-a787-402a-b143-10829b107975/manager/0.log"
Apr 16 14:29:53.576710 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:53.576675 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-b5dln_dc0bbac0-13c6-4a05-9912-955257610f3d/manager/0.log"
Apr 16 14:29:55.023368 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:55.023334 2571 generic.go:358] "Generic (PLEG): container finished" podID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" containerID="116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd" exitCode=0
Apr 16 14:29:55.023907 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:55.023411 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" event={"ID":"6b053850-00e6-4302-ab0e-2b8b1921ab7b","Type":"ContainerDied","Data":"116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd"}
Apr 16 14:29:55.023907 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:55.023853 2571 scope.go:117] "RemoveContainer" containerID="116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd"
Apr 16 14:29:55.257368 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:29:55.257335 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xc8sr_must-gather-rsgmr_6b053850-00e6-4302-ab0e-2b8b1921ab7b/gather/0.log"
Apr 16 14:30:00.741032 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:00.740997 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xc8sr/must-gather-rsgmr"]
Apr 16 14:30:00.741447 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:00.741251 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" containerName="copy" containerID="cri-o://0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25" gracePeriod=2
Apr 16 14:30:00.743859 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:00.743801 2571 status_manager.go:895] "Failed to get status for pod" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" err="pods \"must-gather-rsgmr\" is forbidden: User \"system:node:ip-10-0-131-99.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xc8sr\": no relationship found between node 'ip-10-0-131-99.ec2.internal' and this object"
Apr 16 14:30:00.745826 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:00.745473 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xc8sr/must-gather-rsgmr"]
Apr 16 14:30:00.973635 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:00.973610 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xc8sr_must-gather-rsgmr_6b053850-00e6-4302-ab0e-2b8b1921ab7b/copy/0.log"
Apr 16 14:30:00.973988 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:00.973975 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:30:00.977128 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:00.977099 2571 status_manager.go:895] "Failed to get status for pod" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" err="pods \"must-gather-rsgmr\" is forbidden: User \"system:node:ip-10-0-131-99.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xc8sr\": no relationship found between node 'ip-10-0-131-99.ec2.internal' and this object"
Apr 16 14:30:01.044288 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.044205 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xc8sr_must-gather-rsgmr_6b053850-00e6-4302-ab0e-2b8b1921ab7b/copy/0.log"
Apr 16 14:30:01.044523 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.044501 2571 generic.go:358] "Generic (PLEG): container finished" podID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" containerID="0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25" exitCode=143
Apr 16 14:30:01.044582 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.044550 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xc8sr/must-gather-rsgmr"
Apr 16 14:30:01.044627 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.044554 2571 scope.go:117] "RemoveContainer" containerID="0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25"
Apr 16 14:30:01.046940 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.046906 2571 status_manager.go:895] "Failed to get status for pod" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" err="pods \"must-gather-rsgmr\" is forbidden: User \"system:node:ip-10-0-131-99.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xc8sr\": no relationship found between node 'ip-10-0-131-99.ec2.internal' and this object"
Apr 16 14:30:01.052182 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.052161 2571 scope.go:117] "RemoveContainer" containerID="116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd"
Apr 16 14:30:01.063366 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.063332 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5qf4\" (UniqueName: \"kubernetes.io/projected/6b053850-00e6-4302-ab0e-2b8b1921ab7b-kube-api-access-h5qf4\") pod \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\" (UID: \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\") "
Apr 16 14:30:01.063539 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.063439 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b053850-00e6-4302-ab0e-2b8b1921ab7b-must-gather-output\") pod \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\" (UID: \"6b053850-00e6-4302-ab0e-2b8b1921ab7b\") "
Apr 16 14:30:01.065694 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.065650 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b053850-00e6-4302-ab0e-2b8b1921ab7b-kube-api-access-h5qf4" (OuterVolumeSpecName: "kube-api-access-h5qf4") pod "6b053850-00e6-4302-ab0e-2b8b1921ab7b" (UID: "6b053850-00e6-4302-ab0e-2b8b1921ab7b"). InnerVolumeSpecName "kube-api-access-h5qf4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:30:01.066825 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.066806 2571 scope.go:117] "RemoveContainer" containerID="0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25"
Apr 16 14:30:01.067188 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:30:01.067154 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25\": container with ID starting with 0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25 not found: ID does not exist" containerID="0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25"
Apr 16 14:30:01.067295 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.067200 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25"} err="failed to get container status \"0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25\": rpc error: code = NotFound desc = could not find container \"0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25\": container with ID starting with 0dcca969f4dd5e1a371ce9b0191a38d17b3c1cafd684d3ae13a2d69a8ffb2a25 not found: ID does not exist"
Apr 16 14:30:01.067295 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.067224 2571 scope.go:117] "RemoveContainer" containerID="116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd"
Apr 16 14:30:01.067556 ip-10-0-131-99 kubenswrapper[2571]: E0416 14:30:01.067522 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd\": container with ID starting with 116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd not found: ID does not exist" containerID="116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd"
Apr 16 14:30:01.067621 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.067559 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd"} err="failed to get container status \"116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd\": rpc error: code = NotFound desc = could not find container \"116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd\": container with ID starting with 116e93a1253ba8746c68cc40315eac5452b901be3b283db48cb8efec067445fd not found: ID does not exist"
Apr 16 14:30:01.069733 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.069706 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b053850-00e6-4302-ab0e-2b8b1921ab7b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6b053850-00e6-4302-ab0e-2b8b1921ab7b" (UID: "6b053850-00e6-4302-ab0e-2b8b1921ab7b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:30:01.164737 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.164683 2571 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6b053850-00e6-4302-ab0e-2b8b1921ab7b-must-gather-output\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:30:01.164737 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.164730 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h5qf4\" (UniqueName: \"kubernetes.io/projected/6b053850-00e6-4302-ab0e-2b8b1921ab7b-kube-api-access-h5qf4\") on node \"ip-10-0-131-99.ec2.internal\" DevicePath \"\""
Apr 16 14:30:01.228835 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.228803 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bcp9m_83635cb4-c000-4ff0-8ff5-171c0a1c00d0/global-pull-secret-syncer/0.log"
Apr 16 14:30:01.350413 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.350323 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zvzcf_a32af74e-5db6-424e-85ec-f3c363b28eb5/konnectivity-agent/0.log"
Apr 16 14:30:01.356399 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.356288 2571 status_manager.go:895] "Failed to get status for pod" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" pod="openshift-must-gather-xc8sr/must-gather-rsgmr" err="pods \"must-gather-rsgmr\" is forbidden: User \"system:node:ip-10-0-131-99.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xc8sr\": no relationship found between node 'ip-10-0-131-99.ec2.internal' and this object"
Apr 16 14:30:01.398679 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.398640 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-99.ec2.internal_f5b5a9d82f93d048857d4c98e90f0fd3/haproxy/0.log"
Apr 16 14:30:01.956574 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:01.956543 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" path="/var/lib/kubelet/pods/6b053850-00e6-4302-ab0e-2b8b1921ab7b/volumes"
Apr 16 14:30:05.601937 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:05.601897 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-f7hwv_17b6a0f3-a787-402a-b143-10829b107975/manager/0.log"
Apr 16 14:30:05.651902 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:05.651868 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-b5dln_dc0bbac0-13c6-4a05-9912-955257610f3d/manager/0.log"
Apr 16 14:30:07.078944 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:07.078909 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mzqsp_0a21b3db-dd0e-44d3-8703-4ba945e6e96c/node-exporter/0.log"
Apr 16 14:30:07.103132 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:07.103101 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mzqsp_0a21b3db-dd0e-44d3-8703-4ba945e6e96c/kube-rbac-proxy/0.log"
Apr 16 14:30:07.125882 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:07.125856 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mzqsp_0a21b3db-dd0e-44d3-8703-4ba945e6e96c/init-textfile/0.log"
Apr 16 14:30:10.372535 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.372496 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"]
Apr 16 14:30:10.372937 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.372797 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" containerName="gather"
Apr 16 14:30:10.372937 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.372811 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" containerName="gather"
Apr 16 14:30:10.372937 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.372829 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" containerName="copy"
Apr 16 14:30:10.372937 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.372835 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" containerName="copy"
Apr 16 14:30:10.372937 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.372881 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" containerName="gather"
Apr 16 14:30:10.372937 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.372891 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b053850-00e6-4302-ab0e-2b8b1921ab7b" containerName="copy"
Apr 16 14:30:10.379635 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.379604 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.383317 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.383285 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"]
Apr 16 14:30:10.383493 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.383466 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8g7hp\"/\"kube-root-ca.crt\""
Apr 16 14:30:10.383572 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.383529 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8g7hp\"/\"openshift-service-ca.crt\""
Apr 16 14:30:10.383572 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.383543 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8g7hp\"/\"default-dockercfg-9xm4f\""
Apr 16 14:30:10.434856 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.434812 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-sys\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.435047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.434895 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-lib-modules\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.435047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.434952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-podres\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.435047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.434988 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v629q\" (UniqueName: \"kubernetes.io/projected/75d46130-0917-4475-8d15-36bc06ac5e9a-kube-api-access-v629q\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.435047 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.435009 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-proc\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.535922 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.535891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-lib-modules\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.535922 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.535929 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-podres\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.536209 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.535950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v629q\" (UniqueName: \"kubernetes.io/projected/75d46130-0917-4475-8d15-36bc06ac5e9a-kube-api-access-v629q\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.536209 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.535971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-proc\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.536209 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.535992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-sys\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.536209 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.536084 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-proc\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.536209 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.536104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-sys\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.536209 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.536113 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-lib-modules\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.536209 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.536114 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75d46130-0917-4475-8d15-36bc06ac5e9a-podres\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.544930 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.544898 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v629q\" (UniqueName: \"kubernetes.io/projected/75d46130-0917-4475-8d15-36bc06ac5e9a-kube-api-access-v629q\") pod \"perf-node-gather-daemonset-xm7gf\" (UID: \"75d46130-0917-4475-8d15-36bc06ac5e9a\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.691675 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.691568 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:10.820690 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:10.820655 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"]
Apr 16 14:30:10.823733 ip-10-0-131-99 kubenswrapper[2571]: W0416 14:30:10.823695 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod75d46130_0917_4475_8d15_36bc06ac5e9a.slice/crio-04c952d781242a32a1dd91cfe1f4fe740573151a9230ad4ca15c26d421c9f29d WatchSource:0}: Error finding container 04c952d781242a32a1dd91cfe1f4fe740573151a9230ad4ca15c26d421c9f29d: Status 404 returned error can't find the container with id 04c952d781242a32a1dd91cfe1f4fe740573151a9230ad4ca15c26d421c9f29d
Apr 16 14:30:11.083135 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:11.083100 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf" event={"ID":"75d46130-0917-4475-8d15-36bc06ac5e9a","Type":"ContainerStarted","Data":"5cc4e71b894a574e2607618a6b02767a1e77993e6bd3faeb7b3e79e97486565d"}
Apr 16 14:30:11.083135 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:11.083135 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf" event={"ID":"75d46130-0917-4475-8d15-36bc06ac5e9a","Type":"ContainerStarted","Data":"04c952d781242a32a1dd91cfe1f4fe740573151a9230ad4ca15c26d421c9f29d"}
Apr 16 14:30:11.083344 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:11.083212 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf"
Apr 16 14:30:11.100691 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:11.100637 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf" podStartSLOduration=1.100619994 podStartE2EDuration="1.100619994s" podCreationTimestamp="2026-04-16 14:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:11.099362857 +0000 UTC m=+1857.728731136" watchObservedRunningTime="2026-04-16 14:30:11.100619994 +0000 UTC m=+1857.729988273"
Apr 16 14:30:11.555765 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:11.555734 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v72fr_eaab260d-b8fe-47b6-8446-b16d19857d43/dns/0.log"
Apr 16 14:30:11.602968 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:11.602932 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v72fr_eaab260d-b8fe-47b6-8446-b16d19857d43/kube-rbac-proxy/0.log"
Apr 16 14:30:11.654065 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:11.654032 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l47f6_bb45c2a8-3222-492e-a359-cd27a52d6faa/dns-node-resolver/0.log"
Apr 16 14:30:12.211469 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:12.211435 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h95cv_444c36ba-0722-4b97-88e0-a10913a4f6b4/node-ca/0.log"
Apr 16 14:30:13.149258 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:13.149222 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mdhsz_cf7faa71-231f-4467-a789-acd6da492013/discovery/0.log"
Apr 16 14:30:13.723881 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:13.723849 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wjbcm_aa74ab6f-55fb-4757-9677-130c7dc8c62c/serve-healthcheck-canary/0.log"
Apr 16 14:30:14.198468 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:14.198433 2571 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-insights_insights-runtime-extractor-cw5lz_992a5177-6c6b-4d11-8f10-1218ec3dbd79/kube-rbac-proxy/0.log" Apr 16 14:30:14.229632 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:14.229599 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cw5lz_992a5177-6c6b-4d11-8f10-1218ec3dbd79/exporter/0.log" Apr 16 14:30:14.263188 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:14.263161 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cw5lz_992a5177-6c6b-4d11-8f10-1218ec3dbd79/extractor/0.log" Apr 16 14:30:17.007623 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:17.007591 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-x6qvc_9bbcefce-670d-4d32-a732-52ef39512af3/openshift-lws-operator/0.log" Apr 16 14:30:17.096133 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:17.096102 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-xm7gf" Apr 16 14:30:17.571413 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:17.571373 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-9bbf58456-xsttp_20cec3cb-f572-4622-8557-b5b5b2ce90e1/manager/0.log" Apr 16 14:30:17.649177 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:17.649139 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-647dc49bd9-5g4kh_aaf7ddcf-facf-4b2b-a412-430765f198d3/manager/0.log" Apr 16 14:30:24.067736 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:24.067708 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-52th9_497c5497-4184-4e77-90af-4b9edc13fa89/kube-multus-additional-cni-plugins/0.log" Apr 16 14:30:24.100515 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:24.100482 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-52th9_497c5497-4184-4e77-90af-4b9edc13fa89/egress-router-binary-copy/0.log" Apr 16 14:30:24.123285 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:24.123251 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-52th9_497c5497-4184-4e77-90af-4b9edc13fa89/cni-plugins/0.log" Apr 16 14:30:24.146188 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:24.146154 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-52th9_497c5497-4184-4e77-90af-4b9edc13fa89/bond-cni-plugin/0.log" Apr 16 14:30:24.171903 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:24.171871 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-52th9_497c5497-4184-4e77-90af-4b9edc13fa89/routeoverride-cni/0.log" Apr 16 14:30:24.197327 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:24.197300 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-52th9_497c5497-4184-4e77-90af-4b9edc13fa89/whereabouts-cni-bincopy/0.log" Apr 16 14:30:24.218973 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:24.218943 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-52th9_497c5497-4184-4e77-90af-4b9edc13fa89/whereabouts-cni/0.log" Apr 16 14:30:24.668431 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:24.668391 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hr2bh_dc274749-ec6f-4398-a91a-e94036d6a048/kube-multus/0.log" Apr 16 14:30:24.801864 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:24.801831 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gg599_97f73dc3-4dcf-4643-8dc6-cd6e6418679b/network-metrics-daemon/0.log" Apr 16 14:30:24.820628 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:30:24.820599 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gg599_97f73dc3-4dcf-4643-8dc6-cd6e6418679b/kube-rbac-proxy/0.log" Apr 16 14:30:25.648754 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:25.648724 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-controller/0.log" Apr 16 14:30:25.667485 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:25.667456 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/0.log" Apr 16 14:30:25.684460 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:25.684432 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovn-acl-logging/1.log" Apr 16 14:30:25.707321 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:25.707286 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/kube-rbac-proxy-node/0.log" Apr 16 14:30:25.731026 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:25.730990 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 14:30:25.750629 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:25.750595 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/northd/0.log" Apr 16 14:30:25.772591 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:25.772561 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/nbdb/0.log" Apr 16 14:30:25.796457 ip-10-0-131-99 
kubenswrapper[2571]: I0416 14:30:25.796425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/sbdb/0.log" Apr 16 14:30:25.976810 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:25.976728 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q9n5_00f5f350-f965-4f31-9400-648a4573f987/ovnkube-controller/0.log" Apr 16 14:30:27.771055 ip-10-0-131-99 kubenswrapper[2571]: I0416 14:30:27.771020 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ptrgm_a45bf770-bb2a-4a8f-8fa8-60cb36789e8c/network-check-target-container/0.log"