Apr 24 21:15:46.270694 ip-10-0-132-159 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:15:46.727622 ip-10-0-132-159 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:15:46.727622 ip-10-0-132-159 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:15:46.727622 ip-10-0-132-159 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:15:46.727622 ip-10-0-132-159 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:15:46.727622 ip-10-0-132-159 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:15:46.729370 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.729277 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:15:46.733671 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733650 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:15:46.733671 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733669 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733676 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
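The deprecation notices above all point at the kubelet config file (the file passed via --config; the flag dump further down shows --config="/etc/kubernetes/kubelet.conf"). A minimal sketch, not part of the log, for pulling those notices out of journal text, assuming one journal entry per line on stdin (for example `journalctl -u kubelet.service | python3 deprecated_flags.py`; the unit and script names are assumptions):

```python
import re
import sys

# Matches the kubelet's "Flag --x has been deprecated, <reason>" notices
# as they appear in the excerpt above.
DEPRECATED_RE = re.compile(r"Flag (--[\w-]+) has been deprecated, (.+)")

def deprecated_flags(lines):
    """Yield (flag, reason) pairs found in kubelet journal output."""
    for line in lines:
        m = DEPRECATED_RE.search(line)
        if m:
            yield m.group(1), m.group(2)

if __name__ == "__main__":
    for flag, reason in sorted(set(deprecated_flags(sys.stdin))):
        print(f"{flag}: {reason}")
```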
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733680 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733684 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733687 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733690 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733693 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733696 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733698 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733701 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733704 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733707 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733709 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733718 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733722 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733725 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733728 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733730 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733733 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:15:46.733734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733736 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733739 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733742 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733744 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733747 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733750 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733753 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733755 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733758 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733761 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733763 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733766 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733768 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733776 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733779 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733781 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733784 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733786 2581 feature_gate.go:328] 
unrecognized feature gate: AWSDedicatedHosts Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733789 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733791 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:15:46.734189 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733794 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733797 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733799 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733802 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733804 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733807 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733809 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733812 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733814 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733817 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733819 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733821 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733824 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733826 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733830 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733832 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733836 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733838 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733841 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:46.734706 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733844 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:15:46.734706 
ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733847 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733849 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733852 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733854 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733857 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733859 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733862 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733864 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733866 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733869 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733872 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733874 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733877 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733880 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733882 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733885 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733887 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733891 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733893 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:15:46.735234 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733896 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733898 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733901 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 
21:15:46.733903 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733905 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733908 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.733910 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734302 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734308 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734311 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734315 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734317 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734320 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734323 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734325 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734328 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734331 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734333 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734336 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734338 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:15:46.735725 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734341 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734344 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734346 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734349 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734351 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734354 2581 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734357 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734360 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734362 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734365 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734367 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734370 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734372 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734375 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734377 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734380 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734382 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734386 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
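The `unrecognized feature gate` warnings from feature_gate.go repeat because the same gate names show up in several passes in this excerpt (entries stamped 21:15:46.733xxx, .734xxx, .735xxx and again at .744xxx). A minimal sketch, not part of the log, that boils the warnings down to the distinct gate names, with the same assumption of one journal entry per line on stdin:

```python
import re
import sys

# Matches the gate name in feature_gate.go's "unrecognized feature gate: <Name>" warnings.
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def unrecognized_gates(lines):
    """Return the sorted set of feature-gate names the kubelet did not recognize."""
    return sorted({m.group(1) for line in lines for m in GATE_RE.finditer(line)})

if __name__ == "__main__":
    for gate in unrecognized_gates(sys.stdin):
        print(gate)
```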
Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734403 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734407 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:15:46.736246 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734410 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734413 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734416 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734419 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734422 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734425 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734428 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734431 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734434 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734436 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734439 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734443 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734445 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734448 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734450 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734453 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734455 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734457 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734460 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:15:46.736849 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734463 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 
21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734466 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734468 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734471 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734473 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734475 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734478 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734480 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734483 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734486 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734489 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734492 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734496 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734499 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734502 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734504 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734507 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734510 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734512 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734514 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:15:46.737352 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734518 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734521 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734523 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734526 2581 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734528 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734531 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734533 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734536 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734538 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734540 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734543 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734545 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734548 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.734551 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735310 2581 flags.go:64] FLAG: --address="0.0.0.0" Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735318 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735325 2581 flags.go:64] FLAG: --anonymous-auth="true" Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735329 2581 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735334 2581 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735337 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735346 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 24 21:15:46.737866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735351 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735354 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735357 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735361 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735364 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735368 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 24 21:15:46.738373 ip-10-0-132-159 
kubenswrapper[2581]: I0424 21:15:46.735371 2581 flags.go:64] FLAG: --cgroup-root="" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735374 2581 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735376 2581 flags.go:64] FLAG: --client-ca-file="" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735379 2581 flags.go:64] FLAG: --cloud-config="" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735382 2581 flags.go:64] FLAG: --cloud-provider="external" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735385 2581 flags.go:64] FLAG: --cluster-dns="[]" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735401 2581 flags.go:64] FLAG: --cluster-domain="" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735404 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735407 2581 flags.go:64] FLAG: --config-dir="" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735411 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735414 2581 flags.go:64] FLAG: --container-log-max-files="5" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735418 2581 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735421 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735425 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735428 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735431 2581 flags.go:64] FLAG: --contention-profiling="false" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735434 2581 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735437 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735440 2581 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 24 21:15:46.738373 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735444 2581 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735448 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735451 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735454 2581 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735456 2581 flags.go:64] FLAG: --enable-load-reader="false" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735459 2581 flags.go:64] FLAG: --enable-server="true" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735462 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 21:15:46.739024 
ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735466 2581 flags.go:64] FLAG: --event-burst="100" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735469 2581 flags.go:64] FLAG: --event-qps="50" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735472 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735476 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735479 2581 flags.go:64] FLAG: --eviction-hard="" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735483 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735486 2581 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735489 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735492 2581 flags.go:64] FLAG: --eviction-soft="" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735495 2581 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735498 2581 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735501 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735504 2581 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735507 2581 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735510 2581 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735513 2581 flags.go:64] FLAG: --feature-gates="" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735516 2581 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735519 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:15:46.739024 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735522 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735525 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735528 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735531 2581 flags.go:64] FLAG: --help="false" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735534 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-132-159.ec2.internal" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735537 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735540 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735543 2581 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735547 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735551 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735554 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735557 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735560 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735563 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735566 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735569 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735572 2581 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735575 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735578 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735581 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735583 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735586 2581 flags.go:64] FLAG: --lock-file="" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735590 2581 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735593 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:15:46.739672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735596 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735601 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735604 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735607 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735610 2581 flags.go:64] FLAG: --logging-format="text" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735613 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735616 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735619 2581 flags.go:64] FLAG: --manifest-url="" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735622 2581 
flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735626 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735629 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735633 2581 flags.go:64] FLAG: --max-pods="110" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735636 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735639 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735642 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735645 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735648 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735652 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735655 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735672 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735675 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735678 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735681 2581 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:15:46.740256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735684 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735690 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735693 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735696 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735699 2581 flags.go:64] FLAG: --port="10250" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735702 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735705 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0da58a660f3285333" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735708 2581 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735711 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735714 2581 flags.go:64] FLAG: --register-node="true" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735717 2581 flags.go:64] FLAG: 
--register-schedulable="true" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735719 2581 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735723 2581 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735726 2581 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735729 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735732 2581 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735736 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735739 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735742 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735745 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735748 2581 flags.go:64] FLAG: --runonce="false" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735751 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735754 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735757 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735760 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735763 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:15:46.740827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735766 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735769 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735772 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735775 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735778 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735781 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735784 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735787 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735792 2581 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735795 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735800 2581 
flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735803 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735806 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735811 2581 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735813 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735816 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735819 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735822 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735825 2581 flags.go:64] FLAG: --v="2" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735830 2581 flags.go:64] FLAG: --version="false" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735833 2581 flags.go:64] FLAG: --vmodule="" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735838 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.735841 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735936 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:15:46.741466 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735940 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735943 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735945 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735948 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735951 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735954 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735956 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735958 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735961 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735964 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735967 2581 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735970 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735972 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735975 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735978 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735980 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735984 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735987 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735990 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735992 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:15:46.742037 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735995 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.735997 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736000 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736004 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736007 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736010 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736014 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736016 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736019 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736021 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736024 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736028 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
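The `flags.go:64] FLAG: --name="value"` entries earlier in the excerpt are the kubelet's dump of its effective command-line flags at startup. A minimal sketch, not part of the log, that collects them into a name-to-value mapping, again assuming one journal entry per line on stdin:

```python
import re
import sys

# Matches the kubelet startup flag dump, e.g. `flags.go:64] FLAG: --address="0.0.0.0"`.
FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')

def flag_dump(lines):
    """Return {flag: value} for every FLAG: entry found in the input."""
    flags = {}
    for line in lines:
        for m in FLAG_RE.finditer(line):
            flags[m.group(1)] = m.group(2)
    return flags

if __name__ == "__main__":
    for flag, value in sorted(flag_dump(sys.stdin).items()):
        print(f"{flag}={value!r}")
```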
Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736031 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736034 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736037 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736040 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736042 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736045 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736047 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:15:46.742579 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736050 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736053 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736055 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736058 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736061 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736064 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736066 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736069 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736072 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736075 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736078 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736080 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736083 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736086 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736088 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: 
W0424 21:15:46.736090 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736093 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736096 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736098 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736101 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:15:46.743095 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736103 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736106 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736109 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736111 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736114 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736116 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736118 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736121 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736124 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736126 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736128 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736131 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736133 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736139 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736141 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736144 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736147 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736150 2581 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736152 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736155 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:15:46.743991 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736157 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:15:46.744620 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736161 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:15:46.744620 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736163 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:15:46.744620 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736166 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:15:46.744620 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736168 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:15:46.744620 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.736171 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:15:46.744620 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.736795 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:15:46.744850 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.744832 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:15:46.744883 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.744851 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:15:46.744915 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744899 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:15:46.744915 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744904 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:15:46.744915 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744907 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:15:46.744915 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744911 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:15:46.744915 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744914 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:15:46.744915 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744917 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744920 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744924 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744927 2581 
feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744929 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744932 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744935 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744938 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744941 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744943 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744946 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744949 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744952 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744954 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744957 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744959 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744962 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744965 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744968 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744971 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:15:46.745066 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744974 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744976 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744979 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744981 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744984 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744987 2581 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744991 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744995 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.744998 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745001 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745004 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745006 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745009 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745011 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745014 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745017 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745020 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745022 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745025 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745028 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:15:46.745587 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745030 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745033 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745035 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745038 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745040 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745043 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745046 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745049 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 
21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745051 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745054 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745056 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745059 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745061 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745064 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745066 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745069 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745072 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745075 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745077 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745080 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:15:46.746120 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745083 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745087 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745091 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745094 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745097 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745100 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745104 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745107 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745110 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745112 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745115 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745120 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745123 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745126 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745128 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745131 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745134 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745136 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745139 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:15:46.746707 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745141 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745144 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.745149 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true 
UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745241 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745245 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745248 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745251 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745254 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745257 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745260 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745262 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745265 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745267 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745270 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745273 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745275 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:15:46.747212 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745278 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745280 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745283 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745287 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745290 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745293 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745295 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745298 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745301 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: 
W0424 21:15:46.745304 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745307 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745310 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745312 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745315 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745318 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745320 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745323 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745325 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745328 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745330 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:15:46.747630 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745333 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745335 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745338 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745340 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745343 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745345 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745348 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745350 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745353 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745355 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745358 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745360 2581 feature_gate.go:328] unrecognized feature gate: 
NewOLM Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745363 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745365 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745368 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745371 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745374 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745376 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745379 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745382 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:15:46.748143 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745384 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745387 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745407 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745411 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745415 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745418 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745421 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745423 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745426 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745429 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745431 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745434 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745437 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745439 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:15:46.748652 ip-10-0-132-159 
kubenswrapper[2581]: W0424 21:15:46.745442 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745444 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745446 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745450 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745455 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:15:46.748652 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745459 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745461 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745464 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745467 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745469 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745472 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745475 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745477 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745480 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745483 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745485 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745488 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745490 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:46.745493 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.745497 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:15:46.749128 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.746276 2581 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:15:46.749600 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.749108 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:15:46.750067 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.750055 2581 server.go:1019] "Starting client certificate rotation" Apr 24 21:15:46.750187 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.750167 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:15:46.750348 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.750336 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:15:46.777843 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.777817 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:15:46.781339 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.781317 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:15:46.798240 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.798215 2581 log.go:25] "Validated CRI v1 runtime API" Apr 24 21:15:46.804172 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.804156 2581 log.go:25] "Validated CRI v1 image API" Apr 24 21:15:46.805475 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.805461 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 21:15:46.809924 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.809905 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:15:46.812081 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.812061 2581 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 ac657d59-599b-4373-a15b-79fb1064475d:/dev/nvme0n1p4 c9f5d840-447f-4081-9d95-3c18693b7575:/dev/nvme0n1p3] Apr 24 21:15:46.812145 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.812081 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 21:15:46.819467 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.819174 2581 manager.go:217] Machine: {Timestamp:2026-04-24 21:15:46.816736437 +0000 UTC m=+0.425875406 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3116689 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ab521847eded50f7fbecca14339da SystemUUID:ec2ab521-847e-ded5-0f7f-becca14339da 
BootID:c2504563-74c9-47f9-b60e-484462aa3c52 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d9:09:34:3f:5f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d9:09:34:3f:5f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:ed:87:23:b4:b9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 21:15:46.819467 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.819456 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
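The fs.go and manager.go entries above are cAdvisor's hardware and filesystem inventory: per-mount capacities and inode counts, the NVMe disk, the network devices, and the CPU/NUMA topology. The per-filesystem Capacity and Inodes figures are essentially what statfs(2) reports for each mount point; a small illustrative sketch (the mount points are taken from the log, nothing else about cAdvisor's internals is implied):

package main

import (
	"fmt"

	"golang.org/x/sys/unix"
)

func main() {
	// Mount points reported in the cAdvisor filesystem inventory above.
	for _, mount := range []string{"/boot", "/var", "/run", "/tmp"} {
		var st unix.Statfs_t
		if err := unix.Statfs(mount, &st); err != nil {
			fmt.Println(mount, "statfs error:", err)
			continue
		}
		capacity := st.Blocks * uint64(st.Bsize) // total size in bytes
		fmt.Printf("%s capacity=%d bytes inodes=%d\n", mount, capacity, st.Files)
	}
}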
Apr 24 21:15:46.819613 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.819588 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 21:15:46.821719 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.821697 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 21:15:46.821856 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.821722 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-159.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:15:46.821901 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.821864 2581 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:15:46.821901 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.821873 2581 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:15:46.821901 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.821887 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:15:46.821901 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.821901 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:15:46.823402 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.823379 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:15:46.823519 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.823509 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:15:46.824387 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.824367 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xk8lv" Apr 24 21:15:46.826054 
ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.826044 2581 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:15:46.826110 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.826058 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:15:46.826110 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.826070 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:15:46.826110 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.826079 2581 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:15:46.826110 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.826087 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:15:46.827256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.827238 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:15:46.827256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.827258 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:15:46.830450 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.830425 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:15:46.831651 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.831635 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xk8lv" Apr 24 21:15:46.831960 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.831945 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:15:46.833894 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833878 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:15:46.833942 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833901 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:15:46.833942 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833911 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:15:46.833942 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833919 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:15:46.833942 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833928 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:15:46.833942 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833936 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:15:46.833942 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833945 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:15:46.834111 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833953 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:15:46.834111 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833963 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:15:46.834111 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833974 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:15:46.834111 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833986 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" 
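The container_manager_linux.go node config shown a little earlier (systemd cgroup driver, systemReserved of cpu=500m / memory=1Gi / ephemeral-storage=1Gi, the five hard-eviction thresholds, podPidsLimit=4096) is the in-memory form of settings normally carried in the kubelet's config file. A hedged sketch of roughly equivalent fields using the upstream k8s.io/kubelet/config/v1beta1 types (the actual config file on this node is not shown in the log):

package main

import (
	"fmt"

	kubeletconfig "k8s.io/kubelet/config/v1beta1"
)

func main() {
	pidLimit := int64(4096)
	cfg := kubeletconfig.KubeletConfiguration{
		CgroupDriver: "systemd",
		SystemReserved: map[string]string{
			"cpu":               "500m",
			"memory":            "1Gi",
			"ephemeral-storage": "1Gi",
		},
		EvictionHard: map[string]string{
			"memory.available":   "100Mi",
			"nodefs.available":   "10%",
			"nodefs.inodesFree":  "5%",
			"imagefs.available":  "15%",
			"imagefs.inodesFree": "5%",
		},
		PodPidsLimit: &pidLimit,
	}
	fmt.Println("cgroupDriver:", cfg.CgroupDriver)
	fmt.Println("systemReserved:", cfg.SystemReserved)
	fmt.Println("evictionHard:", cfg.EvictionHard)
}

The same file format can also carry the featureGates map whose resolved form appears in the log above.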
Apr 24 21:15:46.834111 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.833999 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:15:46.836227 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.836211 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:15:46.836227 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.836227 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:15:46.838709 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.838695 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:15:46.839729 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.839714 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:15:46.839809 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.839751 2581 server.go:1295] "Started kubelet" Apr 24 21:15:46.839866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.839821 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:15:46.839898 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.839860 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:15:46.839935 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.839915 2581 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:15:46.840519 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.840501 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:15:46.840633 ip-10-0-132-159 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:15:46.841619 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.841600 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:15:46.841693 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.841663 2581 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:15:46.844073 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.844041 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-159.ec2.internal" not found Apr 24 21:15:46.847689 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.847670 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:15:46.847882 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.847861 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:15:46.848657 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.848443 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:15:46.848657 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.848445 2581 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:15:46.848657 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.848657 2581 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:15:46.848839 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:46.848663 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found" Apr 24 21:15:46.848839 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.848753 2581 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:15:46.848839 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.848762 2581 reconciler.go:26] "Reconciler: start to sync state" Apr 24 
21:15:46.849840 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.849804 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:15:46.849933 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.849922 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:15:46.849992 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.849940 2581 factory.go:55] Registering systemd factory Apr 24 21:15:46.849992 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.849950 2581 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:15:46.850189 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.850170 2581 factory.go:153] Registering CRI-O factory Apr 24 21:15:46.850189 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.850190 2581 factory.go:223] Registration of the crio container factory successfully Apr 24 21:15:46.850323 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.850214 2581 factory.go:103] Registering Raw factory Apr 24 21:15:46.850323 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.850228 2581 manager.go:1196] Started watching for new ooms in manager Apr 24 21:15:46.850323 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:46.850296 2581 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:15:46.853663 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.852592 2581 manager.go:319] Starting recovery of all containers Apr 24 21:15:46.853663 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:46.852840 2581 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-159.ec2.internal\" not found" node="ip-10-0-132-159.ec2.internal" Apr 24 21:15:46.858998 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.858972 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-159.ec2.internal" not found Apr 24 21:15:46.863791 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.863777 2581 manager.go:324] Recovery completed Apr 24 21:15:46.865187 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:46.865160 2581 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 21:15:46.867995 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.867983 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:46.871605 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.871590 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:46.871706 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.871615 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:46.871706 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.871628 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:46.872197 ip-10-0-132-159 kubenswrapper[2581]: I0424 
21:15:46.872182 2581 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:15:46.872241 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.872198 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:15:46.872241 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.872215 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:15:46.874721 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.874710 2581 policy_none.go:49] "None policy: Start" Apr 24 21:15:46.874757 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.874725 2581 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:15:46.874757 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.874734 2581 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.911956 2581 manager.go:341] "Starting Device Plugin manager" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:46.911985 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.911994 2581 server.go:85] "Starting device plugin registration server" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.912230 2581 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.912243 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.912344 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.912434 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.912442 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:46.912903 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:46.912931 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-159.ec2.internal\" not found" Apr 24 21:15:46.920705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.918592 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-159.ec2.internal" not found Apr 24 21:15:46.943929 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.943893 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:15:46.945068 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.945044 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:15:46.945168 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.945074 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:15:46.945168 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.945092 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
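Several entries above show the kubelet bringing up its device-plugin side: the registration server on /var/lib/kubelet/device-plugins/kubelet.sock and the plugin watcher on /var/lib/kubelet/plugins_registry. A device plugin announces itself by dialing that socket and calling Register on the published v1beta1 API. A minimal hedged sketch, assuming a recent google.golang.org/grpc (the plugin socket name and resource name are made up for illustration, and a real plugin would already be serving the DevicePlugin gRPC service on its own socket before registering):

package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	pluginapi "k8s.io/kubelet/pkg/apis/deviceplugin/v1beta1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Dial the kubelet registration socket seen in the log above.
	conn, err := grpc.NewClient("unix://"+pluginapi.KubeletSocket,
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// Hypothetical plugin socket and resource name, purely for illustration.
	_, err = pluginapi.NewRegistrationClient(conn).Register(ctx, &pluginapi.RegisterRequest{
		Version:      pluginapi.Version,
		Endpoint:     "example-device-plugin.sock",
		ResourceName: "example.com/fake-device",
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Println("registered with the kubelet")
}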
Apr 24 21:15:46.945168 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.945100 2581 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:15:46.945168 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:46.945136 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:15:46.948420 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:46.948386 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:15:47.013369 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.013302 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:47.014192 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.014176 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:47.014280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.014205 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:47.014280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.014216 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:47.014280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.014248 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.024258 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.024241 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.024363 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:47.024268 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-159.ec2.internal\": node \"ip-10-0-132-159.ec2.internal\" not found" Apr 24 21:15:47.045605 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.045576 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal"] Apr 24 21:15:47.048297 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.048284 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.050093 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.050074 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.050166 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.050102 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.050166 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.050120 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baec555e5e2b442b2cad3d99698ce3db-config\") pod \"kube-apiserver-proxy-ip-10-0-132-159.ec2.internal\" (UID: \"baec555e5e2b442b2cad3d99698ce3db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.050349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.050334 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.050426 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.050362 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.058893 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.058875 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:15:47.059605 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.059593 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.073988 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.073966 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.073988 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.073981 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:15:47.078211 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.078196 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.086182 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.086169 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:15:47.086233 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:47.086203 2581 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-proxy-ip-10-0-132-159.ec2.internal\" already exists" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.096490 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.096472 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:15:47.096539 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:47.096510 2581 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.150753 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.150726 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baec555e5e2b442b2cad3d99698ce3db-config\") pod \"kube-apiserver-proxy-ip-10-0-132-159.ec2.internal\" (UID: \"baec555e5e2b442b2cad3d99698ce3db\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.150753 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.150754 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.150930 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.150775 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.150930 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.150800 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.150930 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.150826 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baec555e5e2b442b2cad3d99698ce3db-config\") pod \"kube-apiserver-proxy-ip-10-0-132-159.ec2.internal\" (UID: \"baec555e5e2b442b2cad3d99698ce3db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.150930 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.150833 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.388151 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.388127 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.399866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.399843 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" Apr 24 21:15:47.752279 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.752199 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:15:47.752974 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.752341 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:15:47.752974 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.752382 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:15:47.752974 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.752416 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:15:47.827345 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.827220 2581 apiserver.go:52] "Watching apiserver" Apr 24 21:15:47.833695 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.833660 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:10:46 +0000 UTC" deadline="2028-01-10 09:44:58.29912833 +0000 UTC" Apr 24 21:15:47.833695 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.833685 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15012h29m10.465446845s" Apr 24 21:15:47.836631 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.836612 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:15:47.838656 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.838624 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-w6vsz","openshift-cluster-node-tuning-operator/tuned-stjmc","openshift-dns/node-resolver-tc6g6","openshift-image-registry/node-ca-flzqj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal","openshift-network-operator/iptables-alerter-qrkcn","openshift-ovn-kubernetes/ovnkube-node-dzbzn","kube-system/konnectivity-agent-wx7v9","kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2","openshift-multus/multus-additional-cni-plugins-qmnsd","openshift-multus/multus-n8xkn","openshift-multus/network-metrics-daemon-mf254"] Apr 24 21:15:47.844132 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.844109 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:47.844248 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:47.844184 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:15:47.846233 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.846210 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.846382 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.846367 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.848046 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.848030 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:15:47.848622 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.848376 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:15:47.848622 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.848441 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-k64g9\"" Apr 24 21:15:47.848622 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.848453 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.848622 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.848605 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:15:47.848862 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.848780 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:15:47.848914 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.848889 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:15:47.849172 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.849153 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6ppx7\"" Apr 24 21:15:47.850672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.850654 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:47.851170 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.850813 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:15:47.851170 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.850851 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:15:47.851170 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.851025 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:15:47.851170 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.851121 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-btscm\"" Apr 24 21:15:47.852289 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.852270 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:15:47.852951 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.852931 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.853058 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853006 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mct2p\"" Apr 24 21:15:47.853127 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853091 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:15:47.853266 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853240 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-systemd\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853359 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853296 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-lib-modules\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853359 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853328 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cff831a4-3dde-4185-b6ac-264f7592353a-etc-tuned\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853359 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853350 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cff831a4-3dde-4185-b6ac-264f7592353a-tmp\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853520 ip-10-0-132-159 kubenswrapper[2581]: I0424 
21:15:47.853377 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:15:47.853520 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853373 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd7tx\" (UniqueName: \"kubernetes.io/projected/8f5835d0-33c0-4340-bfe0-67872e19c79e-kube-api-access-fd7tx\") pod \"node-ca-flzqj\" (UID: \"8f5835d0-33c0-4340-bfe0-67872e19c79e\") " pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.853520 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853432 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-run\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853520 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-sys\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853520 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853470 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-var-lib-kubelet\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853520 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853505 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2r9\" (UniqueName: \"kubernetes.io/projected/cff831a4-3dde-4185-b6ac-264f7592353a-kube-api-access-vh2r9\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853532 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knm9m\" (UniqueName: \"kubernetes.io/projected/a8617564-8813-486b-aaeb-9fd4ef61ca2f-kube-api-access-knm9m\") pod \"node-resolver-tc6g6\" (UID: \"a8617564-8813-486b-aaeb-9fd4ef61ca2f\") " pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853561 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/20f418c6-9af5-4d97-ac8d-065d25b5b429-iptables-alerter-script\") pod \"iptables-alerter-qrkcn\" (UID: \"20f418c6-9af5-4d97-ac8d-065d25b5b429\") " pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853588 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20f418c6-9af5-4d97-ac8d-065d25b5b429-host-slash\") pod \"iptables-alerter-qrkcn\" (UID: \"20f418c6-9af5-4d97-ac8d-065d25b5b429\") " pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 
21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853618 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853642 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-sysconfig\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853677 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-sysctl-conf\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853704 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f5835d0-33c0-4340-bfe0-67872e19c79e-host\") pod \"node-ca-flzqj\" (UID: \"8f5835d0-33c0-4340-bfe0-67872e19c79e\") " pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853741 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-kubernetes\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853764 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-sysctl-d\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.853799 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853799 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-host\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.854191 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853821 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a8617564-8813-486b-aaeb-9fd4ef61ca2f-hosts-file\") pod \"node-resolver-tc6g6\" (UID: \"a8617564-8813-486b-aaeb-9fd4ef61ca2f\") " pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.854191 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853844 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/a8617564-8813-486b-aaeb-9fd4ef61ca2f-tmp-dir\") pod \"node-resolver-tc6g6\" (UID: \"a8617564-8813-486b-aaeb-9fd4ef61ca2f\") " pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.854191 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853865 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f5835d0-33c0-4340-bfe0-67872e19c79e-serviceca\") pod \"node-ca-flzqj\" (UID: \"8f5835d0-33c0-4340-bfe0-67872e19c79e\") " pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.854191 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853921 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhwtd\" (UniqueName: \"kubernetes.io/projected/20f418c6-9af5-4d97-ac8d-065d25b5b429-kube-api-access-mhwtd\") pod \"iptables-alerter-qrkcn\" (UID: \"20f418c6-9af5-4d97-ac8d-065d25b5b429\") " pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:47.854191 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.853948 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-modprobe-d\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.855044 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.855028 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:15:47.855348 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.855328 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:15:47.855454 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.855369 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:15:47.855936 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.855918 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-r4rp4\"" Apr 24 21:15:47.856005 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.855964 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:15:47.856064 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.856005 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:15:47.856064 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.856022 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:15:47.856064 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.856023 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:15:47.856957 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.856942 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-s5q26\"" Apr 24 21:15:47.856957 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.856948 2581 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:15:47.858153 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.857659 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:47.858153 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.857948 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:15:47.859769 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.859725 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pmvlf\"" Apr 24 21:15:47.860510 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.860284 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:15:47.860510 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.860292 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:15:47.860510 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.860407 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:15:47.860510 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.860408 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:15:47.860750 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.860699 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:47.862551 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.862532 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:15:47.862712 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.862697 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:15:47.862988 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.862970 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.863086 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.863016 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:15:47.863086 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.863053 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cx7l9\"" Apr 24 21:15:47.863212 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.863109 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:15:47.863212 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.863161 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:15:47.865062 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.864996 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:15:47.865062 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.865048 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bsw77\"" Apr 24 21:15:47.865292 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.865275 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:47.865361 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:47.865331 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:15:47.878828 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.878809 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dhpbc" Apr 24 21:15:47.885982 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.885964 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dhpbc" Apr 24 21:15:47.936734 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:47.936691 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod906f6cca8711ebeed3b778a79317b11c.slice/crio-a68be585b1edcce5f77f95285eaf06bd0b8ab0624cee11e00ab02f99c6d3c788 WatchSource:0}: Error finding container a68be585b1edcce5f77f95285eaf06bd0b8ab0624cee11e00ab02f99c6d3c788: Status 404 returned error can't find the container with id a68be585b1edcce5f77f95285eaf06bd0b8ab0624cee11e00ab02f99c6d3c788 Apr 24 21:15:47.942458 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.942438 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:15:47.948127 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.948091 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" event={"ID":"906f6cca8711ebeed3b778a79317b11c","Type":"ContainerStarted","Data":"a68be585b1edcce5f77f95285eaf06bd0b8ab0624cee11e00ab02f99c6d3c788"} Apr 24 21:15:47.949321 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.949304 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:15:47.954789 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.954762 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-run-systemd\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.954789 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.954808 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c1d5671-39e8-4826-af5d-f49631e0ece2-ovnkube-config\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.954980 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.954845 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-modprobe-d\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.954980 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.954873 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cff831a4-3dde-4185-b6ac-264f7592353a-tmp\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.954980 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.954960 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-system-cni-dir\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.955118 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.954995 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-multus-cni-dir\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.955118 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955027 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-var-lib-kubelet\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955118 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955055 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/20f418c6-9af5-4d97-ac8d-065d25b5b429-iptables-alerter-script\") pod \"iptables-alerter-qrkcn\" (UID: \"20f418c6-9af5-4d97-ac8d-065d25b5b429\") " pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:47.955118 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955084 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-system-cni-dir\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:47.955118 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955103 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-var-lib-kubelet\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955118 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955114 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a6bc573-bfae-4ef4-a14b-3d3958d53365-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:47.955362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955103 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-modprobe-d\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955152 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-sysconfig\") pod \"tuned-stjmc\" (UID: 
\"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955176 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:15:47.955362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955194 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-sysctl-conf\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955287 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-os-release\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:47.955362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955315 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-sysctl-conf\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955323 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw6rp\" (UniqueName: \"kubernetes.io/projected/2a6bc573-bfae-4ef4-a14b-3d3958d53365-kube-api-access-qw6rp\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:47.955362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955324 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-sysconfig\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955351 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-cnibin\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955378 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-multus-socket-dir-parent\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955421 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/85b37092-0856-40bf-ad2e-32b72caa332b-multus-daemon-config\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955445 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-etc-kubernetes\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955463 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-kubernetes\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955496 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85b37092-0856-40bf-ad2e-32b72caa332b-cni-binary-copy\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955514 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-kubernetes\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955528 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-sys-fs\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955555 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcl2g\" (UniqueName: \"kubernetes.io/projected/50116296-9c0c-4212-b36a-62c335f13209-kube-api-access-dcl2g\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955578 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-systemd-units\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955603 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955670 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cff831a4-3dde-4185-b6ac-264f7592353a-etc-tuned\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955711 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a6bc573-bfae-4ef4-a14b-3d3958d53365-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955736 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-registration-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955760 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-etc-selinux\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955786 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knm9m\" (UniqueName: \"kubernetes.io/projected/a8617564-8813-486b-aaeb-9fd4ef61ca2f-kube-api-access-knm9m\") pod \"node-resolver-tc6g6\" (UID: \"a8617564-8813-486b-aaeb-9fd4ef61ca2f\") " pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.955807 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955810 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-os-release\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955817 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/20f418c6-9af5-4d97-ac8d-065d25b5b429-iptables-alerter-script\") pod \"iptables-alerter-qrkcn\" (UID: \"20f418c6-9af5-4d97-ac8d-065d25b5b429\") " pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955833 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-device-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955859 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-cnibin\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.955896 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-var-lib-cni-multus\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956157 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2aa70b5d-a245-4f50-adbe-ce8e71716842-konnectivity-ca\") pod \"konnectivity-agent-wx7v9\" (UID: \"2aa70b5d-a245-4f50-adbe-ce8e71716842\") " pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956193 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-sysctl-d\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956223 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-host\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956247 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8617564-8813-486b-aaeb-9fd4ef61ca2f-tmp-dir\") pod \"node-resolver-tc6g6\" (UID: \"a8617564-8813-486b-aaeb-9fd4ef61ca2f\") " pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956269 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f5835d0-33c0-4340-bfe0-67872e19c79e-serviceca\") pod \"node-ca-flzqj\" (UID: \"8f5835d0-33c0-4340-bfe0-67872e19c79e\") " pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956294 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a6bc573-bfae-4ef4-a14b-3d3958d53365-cni-binary-copy\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956316 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-run-netns\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956339 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c1d5671-39e8-4826-af5d-f49631e0ece2-ovn-node-metrics-cert\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956362 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-systemd\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956379 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-sysctl-d\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956444 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-host\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956488 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-lib-modules\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.956550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956383 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-lib-modules\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956537 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20f418c6-9af5-4d97-ac8d-065d25b5b429-host-slash\") pod \"iptables-alerter-qrkcn\" (UID: \"20f418c6-9af5-4d97-ac8d-065d25b5b429\") " pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956594 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-hostroot\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956657 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20f418c6-9af5-4d97-ac8d-065d25b5b429-host-slash\") pod \"iptables-alerter-qrkcn\" (UID: 
\"20f418c6-9af5-4d97-ac8d-065d25b5b429\") " pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956751 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-etc-systemd\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956786 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-multus-conf-dir\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956833 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqtnc\" (UniqueName: \"kubernetes.io/projected/85b37092-0856-40bf-ad2e-32b72caa332b-kube-api-access-hqtnc\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956885 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-socket-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956910 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-var-lib-kubelet\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956932 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ffw5\" (UniqueName: \"kubernetes.io/projected/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-kube-api-access-8ffw5\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956973 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-kubelet\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.956996 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-var-lib-openvswitch\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957005 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f5835d0-33c0-4340-bfe0-67872e19c79e-serviceca\") pod \"node-ca-flzqj\" (UID: \"8f5835d0-33c0-4340-bfe0-67872e19c79e\") " pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957038 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-run-openvswitch\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957077 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-run-ovn-kubernetes\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957111 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957152 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2aa70b5d-a245-4f50-adbe-ce8e71716842-agent-certs\") pod \"konnectivity-agent-wx7v9\" (UID: \"2aa70b5d-a245-4f50-adbe-ce8e71716842\") " pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:15:47.957336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957176 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-slash\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957209 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-run-ovn\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957231 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-log-socket\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957252 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-cni-bin\") pod 
\"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957277 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhwtd\" (UniqueName: \"kubernetes.io/projected/20f418c6-9af5-4d97-ac8d-065d25b5b429-kube-api-access-mhwtd\") pod \"iptables-alerter-qrkcn\" (UID: \"20f418c6-9af5-4d97-ac8d-065d25b5b429\") " pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957299 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-node-log\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957301 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8617564-8813-486b-aaeb-9fd4ef61ca2f-tmp-dir\") pod \"node-resolver-tc6g6\" (UID: \"a8617564-8813-486b-aaeb-9fd4ef61ca2f\") " pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957340 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd7tx\" (UniqueName: \"kubernetes.io/projected/8f5835d0-33c0-4340-bfe0-67872e19c79e-kube-api-access-fd7tx\") pod \"node-ca-flzqj\" (UID: \"8f5835d0-33c0-4340-bfe0-67872e19c79e\") " pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957371 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957421 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-run-k8s-cni-cncf-io\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957444 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957466 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 
21:15:47.957488 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c1d5671-39e8-4826-af5d-f49631e0ece2-env-overrides\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957521 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-run\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957544 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-sys\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957636 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-run\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957638 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2r9\" (UniqueName: \"kubernetes.io/projected/cff831a4-3dde-4185-b6ac-264f7592353a-kube-api-access-vh2r9\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.958122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957662 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cff831a4-3dde-4185-b6ac-264f7592353a-sys\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957708 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-run-netns\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957741 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-etc-openvswitch\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957837 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c1d5671-39e8-4826-af5d-f49631e0ece2-ovnkube-script-lib\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 
21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957883 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdddc\" (UniqueName: \"kubernetes.io/projected/4c1d5671-39e8-4826-af5d-f49631e0ece2-kube-api-access-sdddc\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.957935 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f5835d0-33c0-4340-bfe0-67872e19c79e-host\") pod \"node-ca-flzqj\" (UID: \"8f5835d0-33c0-4340-bfe0-67872e19c79e\") " pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.958034 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f5835d0-33c0-4340-bfe0-67872e19c79e-host\") pod \"node-ca-flzqj\" (UID: \"8f5835d0-33c0-4340-bfe0-67872e19c79e\") " pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.958058 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-cni-netd\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.958112 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a8617564-8813-486b-aaeb-9fd4ef61ca2f-hosts-file\") pod \"node-resolver-tc6g6\" (UID: \"a8617564-8813-486b-aaeb-9fd4ef61ca2f\") " pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.958138 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-var-lib-cni-bin\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.958164 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-run-multus-certs\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:47.958904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.958180 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a8617564-8813-486b-aaeb-9fd4ef61ca2f-hosts-file\") pod \"node-resolver-tc6g6\" (UID: \"a8617564-8813-486b-aaeb-9fd4ef61ca2f\") " pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.959282 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.959158 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cff831a4-3dde-4185-b6ac-264f7592353a-etc-tuned\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.959282 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.959169 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cff831a4-3dde-4185-b6ac-264f7592353a-tmp\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.962720 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:47.962683 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:15:47.962720 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:47.962722 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:15:47.962852 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:47.962734 2581 projected.go:194] Error preparing data for projected volume kube-api-access-v2wmr for pod openshift-network-diagnostics/network-check-target-w6vsz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:47.962852 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:47.962835 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr podName:0d188a02-fe28-4c44-96ea-c22a4f133693 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:48.462787169 +0000 UTC m=+2.071926129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-v2wmr" (UniqueName: "kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr") pod "network-check-target-w6vsz" (UID: "0d188a02-fe28-4c44-96ea-c22a4f133693") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:47.963905 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.963883 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knm9m\" (UniqueName: \"kubernetes.io/projected/a8617564-8813-486b-aaeb-9fd4ef61ca2f-kube-api-access-knm9m\") pod \"node-resolver-tc6g6\" (UID: \"a8617564-8813-486b-aaeb-9fd4ef61ca2f\") " pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:47.964863 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.964841 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2r9\" (UniqueName: \"kubernetes.io/projected/cff831a4-3dde-4185-b6ac-264f7592353a-kube-api-access-vh2r9\") pod \"tuned-stjmc\" (UID: \"cff831a4-3dde-4185-b6ac-264f7592353a\") " pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:47.964960 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.964941 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhwtd\" (UniqueName: \"kubernetes.io/projected/20f418c6-9af5-4d97-ac8d-065d25b5b429-kube-api-access-mhwtd\") pod \"iptables-alerter-qrkcn\" (UID: \"20f418c6-9af5-4d97-ac8d-065d25b5b429\") " pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:47.964999 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:47.964944 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd7tx\" 
(UniqueName: \"kubernetes.io/projected/8f5835d0-33c0-4340-bfe0-67872e19c79e-kube-api-access-fd7tx\") pod \"node-ca-flzqj\" (UID: \"8f5835d0-33c0-4340-bfe0-67872e19c79e\") " pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:47.991416 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:47.991378 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaec555e5e2b442b2cad3d99698ce3db.slice/crio-7fbc3de6b31cf8bf4e994c07dcdc1958623c251c41de8a6df2301e8886b2e812 WatchSource:0}: Error finding container 7fbc3de6b31cf8bf4e994c07dcdc1958623c251c41de8a6df2301e8886b2e812: Status 404 returned error can't find the container with id 7fbc3de6b31cf8bf4e994c07dcdc1958623c251c41de8a6df2301e8886b2e812 Apr 24 21:15:48.058861 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058783 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-run-ovn-kubernetes\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.058861 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058820 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2aa70b5d-a245-4f50-adbe-ce8e71716842-agent-certs\") pod \"konnectivity-agent-wx7v9\" (UID: \"2aa70b5d-a245-4f50-adbe-ce8e71716842\") " pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:15:48.058861 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058835 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-slash\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058881 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-slash\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058880 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-run-ovn-kubernetes\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058908 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-run-ovn\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058926 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-log-socket\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058940 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-cni-bin\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058955 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-node-log\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058977 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059004 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-run-ovn\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059005 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-run-k8s-cni-cncf-io\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059036 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-node-log\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059047 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059015 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-cni-bin\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059074 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.058980 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-log-socket\") pod \"ovnkube-node-dzbzn\" (UID: 
\"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059090 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059101 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-run-k8s-cni-cncf-io\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059119 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c1d5671-39e8-4826-af5d-f49631e0ece2-env-overrides\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:48.059139 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059146 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-run-netns\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059158 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059173 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-etc-openvswitch\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059152 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:48.059200 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs podName:ab61ba5a-75b0-4d88-af4b-3e98166b3f50 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:15:48.559182096 +0000 UTC m=+2.168321070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs") pod "network-metrics-daemon-mf254" (UID: "ab61ba5a-75b0-4d88-af4b-3e98166b3f50") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059209 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-etc-openvswitch\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059211 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-run-netns\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059236 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c1d5671-39e8-4826-af5d-f49631e0ece2-ovnkube-script-lib\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059279 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdddc\" (UniqueName: \"kubernetes.io/projected/4c1d5671-39e8-4826-af5d-f49631e0ece2-kube-api-access-sdddc\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059328 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-cni-netd\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059382 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-cni-netd\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059427 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-var-lib-cni-bin\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.059630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059461 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-run-multus-certs\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") 
" pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059490 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-run-systemd\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059515 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-var-lib-cni-bin\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059535 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c1d5671-39e8-4826-af5d-f49631e0ece2-ovnkube-config\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059552 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-run-multus-certs\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059565 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-system-cni-dir\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059577 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-run-systemd\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059590 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-multus-cni-dir\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059621 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-system-cni-dir\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059638 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c1d5671-39e8-4826-af5d-f49631e0ece2-env-overrides\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059650 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a6bc573-bfae-4ef4-a14b-3d3958d53365-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059640 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-system-cni-dir\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059698 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-multus-cni-dir\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059698 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-os-release\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059711 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-system-cni-dir\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059733 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw6rp\" (UniqueName: \"kubernetes.io/projected/2a6bc573-bfae-4ef4-a14b-3d3958d53365-kube-api-access-qw6rp\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059749 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-os-release\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.060449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059762 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-cnibin\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059787 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-multus-socket-dir-parent\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059809 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/85b37092-0856-40bf-ad2e-32b72caa332b-multus-daemon-config\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059832 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-etc-kubernetes\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059862 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-cnibin\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059865 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-multus-socket-dir-parent\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85b37092-0856-40bf-ad2e-32b72caa332b-cni-binary-copy\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059917 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-sys-fs\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059920 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-etc-kubernetes\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059921 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c1d5671-39e8-4826-af5d-f49631e0ece2-ovnkube-script-lib\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059940 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcl2g\" (UniqueName: 
\"kubernetes.io/projected/50116296-9c0c-4212-b36a-62c335f13209-kube-api-access-dcl2g\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059967 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-systemd-units\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059986 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-sys-fs\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.059996 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060025 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a6bc573-bfae-4ef4-a14b-3d3958d53365-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060027 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-systemd-units\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060061 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.061108 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060061 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c1d5671-39e8-4826-af5d-f49631e0ece2-ovnkube-config\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060089 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-registration-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060122 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-etc-selinux\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060131 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-registration-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060150 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-os-release\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060174 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-device-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060183 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-etc-selinux\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060197 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-cnibin\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060223 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-var-lib-cni-multus\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060227 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-os-release\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060249 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/2aa70b5d-a245-4f50-adbe-ce8e71716842-konnectivity-ca\") pod \"konnectivity-agent-wx7v9\" (UID: \"2aa70b5d-a245-4f50-adbe-ce8e71716842\") " pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060231 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-device-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060262 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a6bc573-bfae-4ef4-a14b-3d3958d53365-cnibin\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060279 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a6bc573-bfae-4ef4-a14b-3d3958d53365-cni-binary-copy\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060286 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-var-lib-cni-multus\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060304 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-run-netns\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060331 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c1d5671-39e8-4826-af5d-f49631e0ece2-ovn-node-metrics-cert\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.061636 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060360 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-hostroot\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060384 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-multus-conf-dir\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060433 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hqtnc\" (UniqueName: \"kubernetes.io/projected/85b37092-0856-40bf-ad2e-32b72caa332b-kube-api-access-hqtnc\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060461 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-socket-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060475 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/85b37092-0856-40bf-ad2e-32b72caa332b-multus-daemon-config\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060483 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-var-lib-kubelet\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060509 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ffw5\" (UniqueName: \"kubernetes.io/projected/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-kube-api-access-8ffw5\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060526 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85b37092-0856-40bf-ad2e-32b72caa332b-cni-binary-copy\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060536 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-kubelet\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060545 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-multus-conf-dir\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060561 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-var-lib-openvswitch\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 
21:15:48.060591 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-hostroot\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060584 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-run-openvswitch\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060638 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-run-openvswitch\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060673 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-run-netns\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060815 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a6bc573-bfae-4ef4-a14b-3d3958d53365-cni-binary-copy\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060862 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2aa70b5d-a245-4f50-adbe-ce8e71716842-konnectivity-ca\") pod \"konnectivity-agent-wx7v9\" (UID: \"2aa70b5d-a245-4f50-adbe-ce8e71716842\") " pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060878 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-host-kubelet\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.062135 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060893 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85b37092-0856-40bf-ad2e-32b72caa332b-host-var-lib-kubelet\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.062673 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060902 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/50116296-9c0c-4212-b36a-62c335f13209-socket-dir\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.062673 ip-10-0-132-159 
kubenswrapper[2581]: I0424 21:15:48.060910 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a6bc573-bfae-4ef4-a14b-3d3958d53365-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.062673 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.060916 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c1d5671-39e8-4826-af5d-f49631e0ece2-var-lib-openvswitch\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.062673 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.061147 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a6bc573-bfae-4ef4-a14b-3d3958d53365-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.062673 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.061520 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2aa70b5d-a245-4f50-adbe-ce8e71716842-agent-certs\") pod \"konnectivity-agent-wx7v9\" (UID: \"2aa70b5d-a245-4f50-adbe-ce8e71716842\") " pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:15:48.062673 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.062560 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c1d5671-39e8-4826-af5d-f49631e0ece2-ovn-node-metrics-cert\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.067380 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.067358 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw6rp\" (UniqueName: \"kubernetes.io/projected/2a6bc573-bfae-4ef4-a14b-3d3958d53365-kube-api-access-qw6rp\") pod \"multus-additional-cni-plugins-qmnsd\" (UID: \"2a6bc573-bfae-4ef4-a14b-3d3958d53365\") " pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.067696 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.067677 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdddc\" (UniqueName: \"kubernetes.io/projected/4c1d5671-39e8-4826-af5d-f49631e0ece2-kube-api-access-sdddc\") pod \"ovnkube-node-dzbzn\" (UID: \"4c1d5671-39e8-4826-af5d-f49631e0ece2\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.067793 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.067772 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcl2g\" (UniqueName: \"kubernetes.io/projected/50116296-9c0c-4212-b36a-62c335f13209-kube-api-access-dcl2g\") pod \"aws-ebs-csi-driver-node-t4xf2\" (UID: \"50116296-9c0c-4212-b36a-62c335f13209\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.068068 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.068054 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ffw5\" (UniqueName: 
\"kubernetes.io/projected/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-kube-api-access-8ffw5\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:48.068466 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.068447 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqtnc\" (UniqueName: \"kubernetes.io/projected/85b37092-0856-40bf-ad2e-32b72caa332b-kube-api-access-hqtnc\") pod \"multus-n8xkn\" (UID: \"85b37092-0856-40bf-ad2e-32b72caa332b\") " pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.182684 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.182655 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-stjmc" Apr 24 21:15:48.188750 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:48.188728 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff831a4_3dde_4185_b6ac_264f7592353a.slice/crio-8529679617f61ffc290999222d20ef6ffeaf2bdcd80e8b70c2a1e2cbf03930f7 WatchSource:0}: Error finding container 8529679617f61ffc290999222d20ef6ffeaf2bdcd80e8b70c2a1e2cbf03930f7: Status 404 returned error can't find the container with id 8529679617f61ffc290999222d20ef6ffeaf2bdcd80e8b70c2a1e2cbf03930f7 Apr 24 21:15:48.195850 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.195833 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tc6g6" Apr 24 21:15:48.202321 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:48.202290 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8617564_8813_486b_aaeb_9fd4ef61ca2f.slice/crio-b23e5e197b7a48cf763a67a117451cd6978c443b2248ee812f13d820b2054257 WatchSource:0}: Error finding container b23e5e197b7a48cf763a67a117451cd6978c443b2248ee812f13d820b2054257: Status 404 returned error can't find the container with id b23e5e197b7a48cf763a67a117451cd6978c443b2248ee812f13d820b2054257 Apr 24 21:15:48.202321 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.202312 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-flzqj" Apr 24 21:15:48.207778 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:48.207755 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f5835d0_33c0_4340_bfe0_67872e19c79e.slice/crio-3227278b2eedd3983c358b04ab931fa5e39473b3749925f093082078345dff38 WatchSource:0}: Error finding container 3227278b2eedd3983c358b04ab931fa5e39473b3749925f093082078345dff38: Status 404 returned error can't find the container with id 3227278b2eedd3983c358b04ab931fa5e39473b3749925f093082078345dff38 Apr 24 21:15:48.216679 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.216661 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-qrkcn" Apr 24 21:15:48.222356 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:48.222331 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f418c6_9af5_4d97_ac8d_065d25b5b429.slice/crio-2a581da93c4c9bdcf80ae9eb52b42bfcd11984115df3eddbda51bccf7a8e3e1c WatchSource:0}: Error finding container 2a581da93c4c9bdcf80ae9eb52b42bfcd11984115df3eddbda51bccf7a8e3e1c: Status 404 returned error can't find the container with id 2a581da93c4c9bdcf80ae9eb52b42bfcd11984115df3eddbda51bccf7a8e3e1c Apr 24 21:15:48.234374 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.234361 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:15:48.239868 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:48.239849 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c1d5671_39e8_4826_af5d_f49631e0ece2.slice/crio-dad1fb7a4ccfb365081e4810be219de0972a72910f9cb8929f0e530a2293f697 WatchSource:0}: Error finding container dad1fb7a4ccfb365081e4810be219de0972a72910f9cb8929f0e530a2293f697: Status 404 returned error can't find the container with id dad1fb7a4ccfb365081e4810be219de0972a72910f9cb8929f0e530a2293f697 Apr 24 21:15:48.241543 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.241525 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:15:48.247595 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.247576 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" Apr 24 21:15:48.247709 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:48.247692 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa70b5d_a245_4f50_adbe_ce8e71716842.slice/crio-36fd4581f6134598051b4327c9c66e67b46fd73c2558dfb450cdc8c235865bd8 WatchSource:0}: Error finding container 36fd4581f6134598051b4327c9c66e67b46fd73c2558dfb450cdc8c235865bd8: Status 404 returned error can't find the container with id 36fd4581f6134598051b4327c9c66e67b46fd73c2558dfb450cdc8c235865bd8 Apr 24 21:15:48.254313 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:48.254291 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50116296_9c0c_4212_b36a_62c335f13209.slice/crio-a311a2aec5ee9d0dfdc38ffa4590467fa02fb99aa9010ddcb89346ad57dd8512 WatchSource:0}: Error finding container a311a2aec5ee9d0dfdc38ffa4590467fa02fb99aa9010ddcb89346ad57dd8512: Status 404 returned error can't find the container with id a311a2aec5ee9d0dfdc38ffa4590467fa02fb99aa9010ddcb89346ad57dd8512 Apr 24 21:15:48.256132 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.256117 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" Apr 24 21:15:48.262198 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.262056 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-n8xkn" Apr 24 21:15:48.263411 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:48.263379 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a6bc573_bfae_4ef4_a14b_3d3958d53365.slice/crio-eb915aabe095cc0bd87251ebaada50398c38e488a96a3c20ecb2576ce7711d49 WatchSource:0}: Error finding container eb915aabe095cc0bd87251ebaada50398c38e488a96a3c20ecb2576ce7711d49: Status 404 returned error can't find the container with id eb915aabe095cc0bd87251ebaada50398c38e488a96a3c20ecb2576ce7711d49 Apr 24 21:15:48.267588 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:15:48.267569 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85b37092_0856_40bf_ad2e_32b72caa332b.slice/crio-be6931277b1e41604310e83d7336d66fab0b55d6f21609db877dfe5a421d0321 WatchSource:0}: Error finding container be6931277b1e41604310e83d7336d66fab0b55d6f21609db877dfe5a421d0321: Status 404 returned error can't find the container with id be6931277b1e41604310e83d7336d66fab0b55d6f21609db877dfe5a421d0321 Apr 24 21:15:48.464205 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.463345 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:48.464205 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:48.463540 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:15:48.464205 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:48.463559 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:15:48.464205 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:48.463571 2581 projected.go:194] Error preparing data for projected volume kube-api-access-v2wmr for pod openshift-network-diagnostics/network-check-target-w6vsz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:48.464205 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:48.463630 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr podName:0d188a02-fe28-4c44-96ea-c22a4f133693 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:49.463612046 +0000 UTC m=+3.072751009 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v2wmr" (UniqueName: "kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr") pod "network-check-target-w6vsz" (UID: "0d188a02-fe28-4c44-96ea-c22a4f133693") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:48.568960 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.564434 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:48.568960 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:48.564634 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:48.568960 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:48.564696 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs podName:ab61ba5a-75b0-4d88-af4b-3e98166b3f50 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:49.564677675 +0000 UTC m=+3.173816648 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs") pod "network-metrics-daemon-mf254" (UID: "ab61ba5a-75b0-4d88-af4b-3e98166b3f50") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:48.568960 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.565177 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:15:48.852090 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.852061 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:15:48.854271 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.854252 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:15:48.886742 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.886675 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:10:47 +0000 UTC" deadline="2027-10-17 16:19:43.361926657 +0000 UTC" Apr 24 21:15:48.886742 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.886705 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12979h3m54.475224459s" Apr 24 21:15:48.965723 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.965682 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" event={"ID":"2a6bc573-bfae-4ef4-a14b-3d3958d53365","Type":"ContainerStarted","Data":"eb915aabe095cc0bd87251ebaada50398c38e488a96a3c20ecb2576ce7711d49"} Apr 24 21:15:48.968152 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.968114 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wx7v9" event={"ID":"2aa70b5d-a245-4f50-adbe-ce8e71716842","Type":"ContainerStarted","Data":"36fd4581f6134598051b4327c9c66e67b46fd73c2558dfb450cdc8c235865bd8"} Apr 24 21:15:48.979128 
ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:48.979059 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tc6g6" event={"ID":"a8617564-8813-486b-aaeb-9fd4ef61ca2f","Type":"ContainerStarted","Data":"b23e5e197b7a48cf763a67a117451cd6978c443b2248ee812f13d820b2054257"} Apr 24 21:15:49.000179 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.000143 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" event={"ID":"baec555e5e2b442b2cad3d99698ce3db","Type":"ContainerStarted","Data":"7fbc3de6b31cf8bf4e994c07dcdc1958623c251c41de8a6df2301e8886b2e812"} Apr 24 21:15:49.004498 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.004468 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" event={"ID":"50116296-9c0c-4212-b36a-62c335f13209","Type":"ContainerStarted","Data":"a311a2aec5ee9d0dfdc38ffa4590467fa02fb99aa9010ddcb89346ad57dd8512"} Apr 24 21:15:49.013822 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.013795 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerStarted","Data":"dad1fb7a4ccfb365081e4810be219de0972a72910f9cb8929f0e530a2293f697"} Apr 24 21:15:49.030711 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.030676 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qrkcn" event={"ID":"20f418c6-9af5-4d97-ac8d-065d25b5b429","Type":"ContainerStarted","Data":"2a581da93c4c9bdcf80ae9eb52b42bfcd11984115df3eddbda51bccf7a8e3e1c"} Apr 24 21:15:49.045352 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.045319 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flzqj" event={"ID":"8f5835d0-33c0-4340-bfe0-67872e19c79e","Type":"ContainerStarted","Data":"3227278b2eedd3983c358b04ab931fa5e39473b3749925f093082078345dff38"} Apr 24 21:15:49.048645 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.048615 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-stjmc" event={"ID":"cff831a4-3dde-4185-b6ac-264f7592353a","Type":"ContainerStarted","Data":"8529679617f61ffc290999222d20ef6ffeaf2bdcd80e8b70c2a1e2cbf03930f7"} Apr 24 21:15:49.056644 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.056614 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8xkn" event={"ID":"85b37092-0856-40bf-ad2e-32b72caa332b","Type":"ContainerStarted","Data":"be6931277b1e41604310e83d7336d66fab0b55d6f21609db877dfe5a421d0321"} Apr 24 21:15:49.472932 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.472897 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:49.473106 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:49.473048 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:15:49.473106 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:49.473068 2581 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:15:49.473106 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:49.473081 2581 projected.go:194] Error preparing data for projected volume kube-api-access-v2wmr for pod openshift-network-diagnostics/network-check-target-w6vsz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:49.473273 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:49.473138 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr podName:0d188a02-fe28-4c44-96ea-c22a4f133693 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:51.473119519 +0000 UTC m=+5.082258483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2wmr" (UniqueName: "kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr") pod "network-check-target-w6vsz" (UID: "0d188a02-fe28-4c44-96ea-c22a4f133693") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:49.574007 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.573969 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:49.574194 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:49.574126 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:49.574194 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:49.574191 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs podName:ab61ba5a-75b0-4d88-af4b-3e98166b3f50 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:51.574172785 +0000 UTC m=+5.183311763 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs") pod "network-metrics-daemon-mf254" (UID: "ab61ba5a-75b0-4d88-af4b-3e98166b3f50") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:49.887082 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.887031 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:10:47 +0000 UTC" deadline="2027-12-13 23:13:41.046955149 +0000 UTC" Apr 24 21:15:49.887082 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.887081 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14353h57m51.159877661s" Apr 24 21:15:49.946046 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.946017 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:49.946221 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:49.946150 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:15:49.946511 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:49.946466 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:49.946623 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:49.946558 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:15:51.491597 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:51.491554 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:51.492067 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:51.491742 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:15:51.492067 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:51.491770 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:15:51.492067 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:51.491783 2581 projected.go:194] Error preparing data for projected volume kube-api-access-v2wmr for pod openshift-network-diagnostics/network-check-target-w6vsz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:51.492067 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:51.491844 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr podName:0d188a02-fe28-4c44-96ea-c22a4f133693 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:55.49182386 +0000 UTC m=+9.100962823 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v2wmr" (UniqueName: "kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr") pod "network-check-target-w6vsz" (UID: "0d188a02-fe28-4c44-96ea-c22a4f133693") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:51.592732 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:51.592638 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:51.592920 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:51.592807 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:51.592920 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:51.592893 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs podName:ab61ba5a-75b0-4d88-af4b-3e98166b3f50 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:55.592869595 +0000 UTC m=+9.202008552 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs") pod "network-metrics-daemon-mf254" (UID: "ab61ba5a-75b0-4d88-af4b-3e98166b3f50") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:51.946128 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:51.946095 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:51.946315 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:51.946221 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:15:51.946447 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:51.946340 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:51.946519 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:51.946475 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:15:53.946597 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:53.945907 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:53.946597 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:53.946038 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:15:53.946597 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:53.946508 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:53.947153 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:53.946623 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:15:55.527131 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:55.527084 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:55.527573 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:55.527244 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:15:55.527573 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:55.527269 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:15:55.527573 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:55.527282 2581 projected.go:194] Error preparing data for projected volume kube-api-access-v2wmr for pod openshift-network-diagnostics/network-check-target-w6vsz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:55.527573 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:55.527346 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr podName:0d188a02-fe28-4c44-96ea-c22a4f133693 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:03.527327037 +0000 UTC m=+17.136466001 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v2wmr" (UniqueName: "kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr") pod "network-check-target-w6vsz" (UID: "0d188a02-fe28-4c44-96ea-c22a4f133693") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:55.628360 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:55.628268 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:55.628567 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:55.628375 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:55.628567 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:55.628464 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs podName:ab61ba5a-75b0-4d88-af4b-3e98166b3f50 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:03.628443219 +0000 UTC m=+17.237582178 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs") pod "network-metrics-daemon-mf254" (UID: "ab61ba5a-75b0-4d88-af4b-3e98166b3f50") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:55.946103 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:55.946040 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:55.946290 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:55.946053 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:55.946290 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:55.946194 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:15:55.946522 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:55.946310 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:15:57.946296 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:57.946262 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:57.946782 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:57.946262 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:57.946782 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:57.946404 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:15:57.946782 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:57.946477 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:15:59.945408 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:59.945371 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:15:59.945832 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:15:59.945371 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:15:59.945832 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:59.945484 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:15:59.945832 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:15:59.945620 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:01.945969 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:01.945931 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:01.946468 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:01.945948 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:01.946468 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:01.946061 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:01.946468 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:01.946181 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:03.585857 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:03.585823 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:03.586483 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:03.585973 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:03.586483 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:03.585993 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:03.586483 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:03.586007 2581 projected.go:194] Error preparing data for projected volume kube-api-access-v2wmr for pod openshift-network-diagnostics/network-check-target-w6vsz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:03.586483 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:03.586077 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr podName:0d188a02-fe28-4c44-96ea-c22a4f133693 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:19.586058776 +0000 UTC m=+33.195197752 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v2wmr" (UniqueName: "kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr") pod "network-check-target-w6vsz" (UID: "0d188a02-fe28-4c44-96ea-c22a4f133693") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:03.687201 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:03.687165 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:03.687383 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:03.687333 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:03.687477 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:03.687427 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs podName:ab61ba5a-75b0-4d88-af4b-3e98166b3f50 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:19.687408555 +0000 UTC m=+33.296547515 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs") pod "network-metrics-daemon-mf254" (UID: "ab61ba5a-75b0-4d88-af4b-3e98166b3f50") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:03.945554 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:03.945525 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:03.945554 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:03.945542 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:03.945768 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:03.945630 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:03.945768 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:03.945756 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:05.945552 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:05.945518 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:05.945552 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:05.945540 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:05.945956 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:05.945619 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:05.945956 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:05.945759 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:07.099342 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.099035 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" event={"ID":"baec555e5e2b442b2cad3d99698ce3db","Type":"ContainerStarted","Data":"a8ed677b6caf493d3c14e0d80fa2c15f9eb17364cb4bd10f4130f7ca903cdeaf"} Apr 24 21:16:07.106467 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.106440 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:16:07.106764 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.106738 2581 generic.go:358] "Generic (PLEG): container finished" podID="4c1d5671-39e8-4826-af5d-f49631e0ece2" containerID="ca1be5436c4b1007b1968fe3b0c8a620f89d4de00652e86d3d4e6e770357892e" exitCode=1 Apr 24 21:16:07.106881 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.106807 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerStarted","Data":"40a118c637dbd2868b39ad8101b5ce24d023c5bbc5e110c82e9c9f806ffcfc07"} Apr 24 21:16:07.106881 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.106846 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerStarted","Data":"b4ca0d2e54a85af94e67de39eafe9f237278a2df486b13a0640c42a4db4dd847"} Apr 24 21:16:07.106881 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.106862 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerStarted","Data":"7f05729d675fcfa558752bb1adc7034d9092bcc694072754285a549cec967d97"} Apr 24 21:16:07.106881 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.106873 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerStarted","Data":"66aa96495072994029c35a9af87720c78e96b5543b4f1d4578ce02695746a1cc"} Apr 24 21:16:07.107082 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.106885 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" 
event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerDied","Data":"ca1be5436c4b1007b1968fe3b0c8a620f89d4de00652e86d3d4e6e770357892e"} Apr 24 21:16:07.107082 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.106899 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerStarted","Data":"ea2fa3ef2790c489797f2caf7cb468acfd28837e43d880089b4223530073295d"} Apr 24 21:16:07.108766 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.108528 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-stjmc" event={"ID":"cff831a4-3dde-4185-b6ac-264f7592353a","Type":"ContainerStarted","Data":"9130eb33865e7f9117ae0d2fb2d2ed1a73735dd7a2d0992f307c1e663366d76f"} Apr 24 21:16:07.112302 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.112223 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8xkn" event={"ID":"85b37092-0856-40bf-ad2e-32b72caa332b","Type":"ContainerStarted","Data":"30a3068da489a21dd3dd2c0a4bffa6f2c207c0f2948e27333b34ede39da3e3cd"} Apr 24 21:16:07.138457 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.138241 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" podStartSLOduration=20.138222125 podStartE2EDuration="20.138222125s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:07.115230147 +0000 UTC m=+20.724369127" watchObservedRunningTime="2026-04-24 21:16:07.138222125 +0000 UTC m=+20.747361105" Apr 24 21:16:07.138719 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.138691 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n8xkn" podStartSLOduration=1.944340169 podStartE2EDuration="20.138681944s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:15:48.271255808 +0000 UTC m=+1.880394765" lastFinishedPulling="2026-04-24 21:16:06.465597569 +0000 UTC m=+20.074736540" observedRunningTime="2026-04-24 21:16:07.137893618 +0000 UTC m=+20.747032598" watchObservedRunningTime="2026-04-24 21:16:07.138681944 +0000 UTC m=+20.747820922" Apr 24 21:16:07.155412 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.155276 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-stjmc" podStartSLOduration=2.170202449 podStartE2EDuration="20.15526028s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:15:48.190251507 +0000 UTC m=+1.799390464" lastFinishedPulling="2026-04-24 21:16:06.175309323 +0000 UTC m=+19.784448295" observedRunningTime="2026-04-24 21:16:07.155155405 +0000 UTC m=+20.764294383" watchObservedRunningTime="2026-04-24 21:16:07.15526028 +0000 UTC m=+20.764399262" Apr 24 21:16:07.945587 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.945556 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:07.945587 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:07.945586 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:07.945789 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:07.945661 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:07.945789 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:07.945757 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:08.008186 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.008163 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:16:08.115604 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.115574 2581 generic.go:358] "Generic (PLEG): container finished" podID="906f6cca8711ebeed3b778a79317b11c" containerID="bd2935d1f0572a9d023cc178eade3355c9867cbb08bb5b571ec95a5df2edc1c9" exitCode=0 Apr 24 21:16:08.116018 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.115655 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" event={"ID":"906f6cca8711ebeed3b778a79317b11c","Type":"ContainerDied","Data":"bd2935d1f0572a9d023cc178eade3355c9867cbb08bb5b571ec95a5df2edc1c9"} Apr 24 21:16:08.117250 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.117224 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" event={"ID":"50116296-9c0c-4212-b36a-62c335f13209","Type":"ContainerStarted","Data":"668fddb04fbf87ec637e5f10f2269aed8b5783144024ff9961a25bedffaaa419"} Apr 24 21:16:08.117250 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.117252 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" event={"ID":"50116296-9c0c-4212-b36a-62c335f13209","Type":"ContainerStarted","Data":"f9142d20f33dbda02e207e0aa806b09d0438d912a765831f264bd62aac6a1487"} Apr 24 21:16:08.118406 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.118362 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qrkcn" event={"ID":"20f418c6-9af5-4d97-ac8d-065d25b5b429","Type":"ContainerStarted","Data":"3448ed611ae85379fbe16f4687c76976bb1c7360b45e3162e0d0b3a836ab1770"} Apr 24 21:16:08.119655 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.119631 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-flzqj" event={"ID":"8f5835d0-33c0-4340-bfe0-67872e19c79e","Type":"ContainerStarted","Data":"b127b2bf43d1a4d7f2da49c3f099e2f0d07362939afc8f35823d3afc549cd64e"} Apr 24 21:16:08.120901 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.120883 2581 generic.go:358] "Generic (PLEG): container finished" podID="2a6bc573-bfae-4ef4-a14b-3d3958d53365" 
containerID="24b9328d994701ff065676b9073396bbbec38e5a80b0f87f6f03dccf0c4ff51b" exitCode=0 Apr 24 21:16:08.120970 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.120939 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" event={"ID":"2a6bc573-bfae-4ef4-a14b-3d3958d53365","Type":"ContainerDied","Data":"24b9328d994701ff065676b9073396bbbec38e5a80b0f87f6f03dccf0c4ff51b"} Apr 24 21:16:08.122112 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.122086 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wx7v9" event={"ID":"2aa70b5d-a245-4f50-adbe-ce8e71716842","Type":"ContainerStarted","Data":"d7deddc72d78ae671d3c778ffaaf8be6be147d0929074a5a6cd98f96148c4310"} Apr 24 21:16:08.123590 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.123385 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tc6g6" event={"ID":"a8617564-8813-486b-aaeb-9fd4ef61ca2f","Type":"ContainerStarted","Data":"320e82c2cba23c1a63b48443d5e4314b2590165dc4ecb12d2c0de8659a303f26"} Apr 24 21:16:08.176122 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.176011 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-flzqj" podStartSLOduration=3.212481968 podStartE2EDuration="21.17599376s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:15:48.209742836 +0000 UTC m=+1.818881792" lastFinishedPulling="2026-04-24 21:16:06.17325461 +0000 UTC m=+19.782393584" observedRunningTime="2026-04-24 21:16:08.175153841 +0000 UTC m=+21.784292820" watchObservedRunningTime="2026-04-24 21:16:08.17599376 +0000 UTC m=+21.785132739" Apr 24 21:16:08.189420 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.189368 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qrkcn" podStartSLOduration=3.239249919 podStartE2EDuration="21.189355223s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:15:48.22381611 +0000 UTC m=+1.832955066" lastFinishedPulling="2026-04-24 21:16:06.173921398 +0000 UTC m=+19.783060370" observedRunningTime="2026-04-24 21:16:08.18912075 +0000 UTC m=+21.798259740" watchObservedRunningTime="2026-04-24 21:16:08.189355223 +0000 UTC m=+21.798494203" Apr 24 21:16:08.221533 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.221475 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tc6g6" podStartSLOduration=3.251527034 podStartE2EDuration="21.221462366s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:15:48.20491164 +0000 UTC m=+1.814050602" lastFinishedPulling="2026-04-24 21:16:06.174846962 +0000 UTC m=+19.783985934" observedRunningTime="2026-04-24 21:16:08.204514062 +0000 UTC m=+21.813653041" watchObservedRunningTime="2026-04-24 21:16:08.221462366 +0000 UTC m=+21.830601345" Apr 24 21:16:08.221761 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.221741 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wx7v9" podStartSLOduration=3.298665271 podStartE2EDuration="21.221735563s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:15:48.250433464 +0000 UTC m=+1.859572422" lastFinishedPulling="2026-04-24 21:16:06.17350374 +0000 UTC m=+19.782642714" observedRunningTime="2026-04-24 21:16:08.22126469 +0000 UTC 
m=+21.830403669" watchObservedRunningTime="2026-04-24 21:16:08.221735563 +0000 UTC m=+21.830874543" Apr 24 21:16:08.923258 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.923032 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:16:08.008180799Z","UUID":"669ac6b0-18c0-4ee2-b05d-f0c84ad11b5a","Handler":null,"Name":"","Endpoint":""} Apr 24 21:16:08.926517 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.926103 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:16:08.926517 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:08.926133 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:16:09.127210 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:09.127113 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" event={"ID":"906f6cca8711ebeed3b778a79317b11c","Type":"ContainerStarted","Data":"bd4ca852cb71a68ff00a738eb5955080b84d4c026128d6498875e63b9b4bc280"} Apr 24 21:16:09.129374 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:09.129344 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" event={"ID":"50116296-9c0c-4212-b36a-62c335f13209","Type":"ContainerStarted","Data":"6f86b568a0a6d5f444406b069ab6546e14ec7df16a0cc97c72955e2cdfd5846b"} Apr 24 21:16:09.132317 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:09.132297 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:16:09.132767 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:09.132720 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerStarted","Data":"e02c22adbd4630dca42ea6fdf34451853dc4534aaa124b586592bccd3a6c7d82"} Apr 24 21:16:09.141682 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:09.141639 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" podStartSLOduration=22.141629011 podStartE2EDuration="22.141629011s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:09.141091458 +0000 UTC m=+22.750230438" watchObservedRunningTime="2026-04-24 21:16:09.141629011 +0000 UTC m=+22.750767989" Apr 24 21:16:09.161695 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:09.161650 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4xf2" podStartSLOduration=1.673815334 podStartE2EDuration="22.161636936s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:15:48.255879682 +0000 UTC m=+1.865018639" lastFinishedPulling="2026-04-24 21:16:08.743701281 +0000 UTC m=+22.352840241" observedRunningTime="2026-04-24 21:16:09.161335393 +0000 UTC m=+22.770474373" watchObservedRunningTime="2026-04-24 21:16:09.161636936 +0000 UTC 
m=+22.770775913" Apr 24 21:16:09.946259 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:09.946227 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:09.946520 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:09.946227 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:09.946520 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:09.946348 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:09.946520 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:09.946443 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:10.482870 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:10.482837 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:16:10.483560 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:10.483539 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:16:11.137151 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:11.137113 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:16:11.137639 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:11.137620 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wx7v9" Apr 24 21:16:11.945500 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:11.945465 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:11.946056 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:11.945465 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:11.946056 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:11.945607 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:11.946056 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:11.945661 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:13.142934 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.142750 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:16:13.143487 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.143263 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerStarted","Data":"147c73c863dd9589478bfe2f63fdbaec03f9bc840bd7c3afe2a76146d67809d8"} Apr 24 21:16:13.143575 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.143554 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:16:13.143627 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.143588 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:16:13.143796 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.143773 2581 scope.go:117] "RemoveContainer" containerID="ca1be5436c4b1007b1968fe3b0c8a620f89d4de00652e86d3d4e6e770357892e" Apr 24 21:16:13.145270 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.145246 2581 generic.go:358] "Generic (PLEG): container finished" podID="2a6bc573-bfae-4ef4-a14b-3d3958d53365" containerID="af0b978b730cbe2d911ea32c03de9d3f87d211f89e23ff68f763021e0b956ac0" exitCode=0 Apr 24 21:16:13.145368 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.145331 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" event={"ID":"2a6bc573-bfae-4ef4-a14b-3d3958d53365","Type":"ContainerDied","Data":"af0b978b730cbe2d911ea32c03de9d3f87d211f89e23ff68f763021e0b956ac0"} Apr 24 21:16:13.159344 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.159312 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:16:13.159557 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.159538 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:16:13.945336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.945301 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:13.945336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:13.945326 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:13.945548 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:13.945451 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:13.945633 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:13.945604 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:14.117384 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.117111 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w6vsz"] Apr 24 21:16:14.127562 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.127537 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mf254"] Apr 24 21:16:14.151084 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.151061 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:16:14.151506 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.151428 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" event={"ID":"4c1d5671-39e8-4826-af5d-f49631e0ece2","Type":"ContainerStarted","Data":"44d35e2f2019ef6596689402b51ca5df05e34f1891e62dc1e64a7af4366ca833"} Apr 24 21:16:14.151705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.151690 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:16:14.153454 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.153431 2581 generic.go:358] "Generic (PLEG): container finished" podID="2a6bc573-bfae-4ef4-a14b-3d3958d53365" containerID="dc0fe41f4e991dd0c7284e72151b3d3c36938244643b05611546d074c6243c50" exitCode=0 Apr 24 21:16:14.153557 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.153505 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:14.153557 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.153510 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" event={"ID":"2a6bc573-bfae-4ef4-a14b-3d3958d53365","Type":"ContainerDied","Data":"dc0fe41f4e991dd0c7284e72151b3d3c36938244643b05611546d074c6243c50"} Apr 24 21:16:14.153688 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:14.153660 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:14.153749 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.153679 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:14.153804 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:14.153782 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:14.177805 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:14.177762 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" podStartSLOduration=9.202812734 podStartE2EDuration="27.177749674s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:15:48.241444205 +0000 UTC m=+1.850583162" lastFinishedPulling="2026-04-24 21:16:06.216381144 +0000 UTC m=+19.825520102" observedRunningTime="2026-04-24 21:16:14.17731912 +0000 UTC m=+27.786458099" watchObservedRunningTime="2026-04-24 21:16:14.177749674 +0000 UTC m=+27.786888652" Apr 24 21:16:15.158204 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:15.158132 2581 generic.go:358] "Generic (PLEG): container finished" podID="2a6bc573-bfae-4ef4-a14b-3d3958d53365" containerID="27c9d75124863df088ea37e17958599c4b5b180229154137d33964cccbccd4ff" exitCode=0 Apr 24 21:16:15.158549 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:15.158210 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" event={"ID":"2a6bc573-bfae-4ef4-a14b-3d3958d53365","Type":"ContainerDied","Data":"27c9d75124863df088ea37e17958599c4b5b180229154137d33964cccbccd4ff"} Apr 24 21:16:15.945343 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:15.945304 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:15.945529 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:15.945304 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:15.945529 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:15.945453 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:15.945636 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:15.945523 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:17.087299 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.087271 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-svh9x"] Apr 24 21:16:17.094897 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.094864 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.095036 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:17.094954 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svh9x" podUID="64430b8d-991a-4176-a2b8-b3c3e21f20ba" Apr 24 21:16:17.099342 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.099317 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-svh9x"] Apr 24 21:16:17.161880 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.161853 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.162035 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:17.161994 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svh9x" podUID="64430b8d-991a-4176-a2b8-b3c3e21f20ba" Apr 24 21:16:17.185751 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.185707 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64430b8d-991a-4176-a2b8-b3c3e21f20ba-dbus\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.185751 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.185752 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64430b8d-991a-4176-a2b8-b3c3e21f20ba-kubelet-config\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.185977 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.185780 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.286169 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.286138 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64430b8d-991a-4176-a2b8-b3c3e21f20ba-kubelet-config\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 
24 21:16:17.286169 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.286177 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.286415 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.286262 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64430b8d-991a-4176-a2b8-b3c3e21f20ba-kubelet-config\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.286415 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.286268 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64430b8d-991a-4176-a2b8-b3c3e21f20ba-dbus\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.286415 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:17.286378 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:17.286558 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:17.286480 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret podName:64430b8d-991a-4176-a2b8-b3c3e21f20ba nodeName:}" failed. No retries permitted until 2026-04-24 21:16:17.786460317 +0000 UTC m=+31.395599287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret") pod "global-pull-secret-syncer-svh9x" (UID: "64430b8d-991a-4176-a2b8-b3c3e21f20ba") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:17.286558 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.286502 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64430b8d-991a-4176-a2b8-b3c3e21f20ba-dbus\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.790171 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.790137 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:17.790352 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:17.790318 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:17.790422 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:17.790406 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret podName:64430b8d-991a-4176-a2b8-b3c3e21f20ba nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:18.790374879 +0000 UTC m=+32.399513851 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret") pod "global-pull-secret-syncer-svh9x" (UID: "64430b8d-991a-4176-a2b8-b3c3e21f20ba") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:17.945625 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.945596 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:17.945814 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:17.945702 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6vsz" podUID="0d188a02-fe28-4c44-96ea-c22a4f133693" Apr 24 21:16:17.945814 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:17.945763 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:17.945933 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:17.945885 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf254" podUID="ab61ba5a-75b0-4d88-af4b-3e98166b3f50" Apr 24 21:16:18.799325 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:18.799292 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:18.799861 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:18.799475 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:18.799861 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:18.799550 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret podName:64430b8d-991a-4176-a2b8-b3c3e21f20ba nodeName:}" failed. No retries permitted until 2026-04-24 21:16:20.799530171 +0000 UTC m=+34.408669132 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret") pod "global-pull-secret-syncer-svh9x" (UID: "64430b8d-991a-4176-a2b8-b3c3e21f20ba") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:18.945750 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:18.945712 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:18.945925 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:18.945840 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svh9x" podUID="64430b8d-991a-4176-a2b8-b3c3e21f20ba" Apr 24 21:16:19.170211 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.170185 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeReady" Apr 24 21:16:19.170386 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.170336 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:16:19.214421 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.214374 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p55cz"] Apr 24 21:16:19.245805 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.245769 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-blv55"] Apr 24 21:16:19.245952 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.245868 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.248406 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.248366 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:16:19.248539 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.248410 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-txjjd\"" Apr 24 21:16:19.248539 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.248427 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:16:19.278609 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.278578 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p55cz"] Apr 24 21:16:19.278609 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.278616 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-blv55"] Apr 24 21:16:19.278833 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.278696 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:19.282336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.281465 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:16:19.282336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.281803 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:16:19.282336 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.282076 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:16:19.282568 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.282403 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qvvhg\"" Apr 24 21:16:19.404870 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.404825 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.405045 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.404904 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsjxk\" (UniqueName: \"kubernetes.io/projected/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-kube-api-access-bsjxk\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.405045 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.404974 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:19.405045 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.405029 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk956\" (UniqueName: \"kubernetes.io/projected/83a219f3-ecef-475c-85b4-5e5f89df5b6f-kube-api-access-lk956\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:19.405175 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.405046 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-config-volume\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.405175 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.405060 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-tmp-dir\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.506163 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.506114 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lk956\" (UniqueName: \"kubernetes.io/projected/83a219f3-ecef-475c-85b4-5e5f89df5b6f-kube-api-access-lk956\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:19.506348 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.506175 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-config-volume\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.506348 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.506200 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-tmp-dir\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.506348 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.506231 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.506348 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.506329 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:19.506598 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.506412 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls podName:593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:20.006370148 +0000 UTC m=+33.615509105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls") pod "dns-default-p55cz" (UID: "593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c") : secret "dns-default-metrics-tls" not found Apr 24 21:16:19.506598 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.506434 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsjxk\" (UniqueName: \"kubernetes.io/projected/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-kube-api-access-bsjxk\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.506598 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.506471 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:19.506598 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.506594 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:19.506759 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.506632 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert podName:83a219f3-ecef-475c-85b4-5e5f89df5b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:16:20.00662003 +0000 UTC m=+33.615759007 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert") pod "ingress-canary-blv55" (UID: "83a219f3-ecef-475c-85b4-5e5f89df5b6f") : secret "canary-serving-cert" not found Apr 24 21:16:19.506833 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.506814 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-config-volume\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.516999 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.516816 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsjxk\" (UniqueName: \"kubernetes.io/projected/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-kube-api-access-bsjxk\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.517162 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.516924 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk956\" (UniqueName: \"kubernetes.io/projected/83a219f3-ecef-475c-85b4-5e5f89df5b6f-kube-api-access-lk956\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:19.521060 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.521031 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-tmp-dir\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:19.607250 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.607209 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:19.607480 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.607421 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:19.607480 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.607449 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:19.607480 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.607465 2581 projected.go:194] Error preparing data for projected volume kube-api-access-v2wmr for pod openshift-network-diagnostics/network-check-target-w6vsz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:19.607644 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.607550 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr podName:0d188a02-fe28-4c44-96ea-c22a4f133693 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.607527647 +0000 UTC m=+65.216666606 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-v2wmr" (UniqueName: "kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr") pod "network-check-target-w6vsz" (UID: "0d188a02-fe28-4c44-96ea-c22a4f133693") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:19.708471 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.708339 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:19.708643 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.708514 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:19.708643 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:19.708607 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs podName:ab61ba5a-75b0-4d88-af4b-3e98166b3f50 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.708586698 +0000 UTC m=+65.317725655 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs") pod "network-metrics-daemon-mf254" (UID: "ab61ba5a-75b0-4d88-af4b-3e98166b3f50") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:19.946068 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.946033 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:19.946632 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.946079 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:19.948632 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.948610 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:16:19.948767 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.948654 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:16:19.948767 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.948754 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2rwk\"" Apr 24 21:16:19.949440 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.949381 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:16:19.949586 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:19.949461 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m7v99\"" Apr 24 21:16:20.011565 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:20.011522 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:20.011757 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:20.011583 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:20.011757 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:20.011690 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:20.011867 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:20.011692 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:20.011867 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:20.011801 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls podName:593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:21.011778781 +0000 UTC m=+34.620917738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls") pod "dns-default-p55cz" (UID: "593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c") : secret "dns-default-metrics-tls" not found Apr 24 21:16:20.011867 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:20.011852 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert podName:83a219f3-ecef-475c-85b4-5e5f89df5b6f nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:21.011829131 +0000 UTC m=+34.620968091 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert") pod "ingress-canary-blv55" (UID: "83a219f3-ecef-475c-85b4-5e5f89df5b6f") : secret "canary-serving-cert" not found Apr 24 21:16:20.817299 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:20.817267 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:20.817487 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:20.817433 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:20.817532 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:20.817491 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret podName:64430b8d-991a-4176-a2b8-b3c3e21f20ba nodeName:}" failed. No retries permitted until 2026-04-24 21:16:24.817478019 +0000 UTC m=+38.426616975 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret") pod "global-pull-secret-syncer-svh9x" (UID: "64430b8d-991a-4176-a2b8-b3c3e21f20ba") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:20.948175 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:20.948151 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:20.950872 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:20.950857 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:16:21.019142 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:21.019113 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:21.019246 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:21.019162 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:21.019284 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:21.019258 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:21.019284 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:21.019274 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:21.019347 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:21.019315 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls podName:593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:23.019299333 +0000 UTC m=+36.628438290 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls") pod "dns-default-p55cz" (UID: "593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c") : secret "dns-default-metrics-tls" not found Apr 24 21:16:21.019347 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:21.019328 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert podName:83a219f3-ecef-475c-85b4-5e5f89df5b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:16:23.01932225 +0000 UTC m=+36.628461207 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert") pod "ingress-canary-blv55" (UID: "83a219f3-ecef-475c-85b4-5e5f89df5b6f") : secret "canary-serving-cert" not found Apr 24 21:16:21.172482 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:21.172445 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" event={"ID":"2a6bc573-bfae-4ef4-a14b-3d3958d53365","Type":"ContainerStarted","Data":"f08085ff15feccb43ad18fe2298a8ebc5d539999225c72c18424e6dfc99caada"} Apr 24 21:16:22.176862 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:22.176828 2581 generic.go:358] "Generic (PLEG): container finished" podID="2a6bc573-bfae-4ef4-a14b-3d3958d53365" containerID="f08085ff15feccb43ad18fe2298a8ebc5d539999225c72c18424e6dfc99caada" exitCode=0 Apr 24 21:16:22.177230 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:22.176882 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" event={"ID":"2a6bc573-bfae-4ef4-a14b-3d3958d53365","Type":"ContainerDied","Data":"f08085ff15feccb43ad18fe2298a8ebc5d539999225c72c18424e6dfc99caada"} Apr 24 21:16:23.035603 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:23.035565 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:23.035802 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:23.035647 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:23.035802 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:23.035738 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:23.035913 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:23.035815 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert podName:83a219f3-ecef-475c-85b4-5e5f89df5b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:16:27.035793004 +0000 UTC m=+40.644931974 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert") pod "ingress-canary-blv55" (UID: "83a219f3-ecef-475c-85b4-5e5f89df5b6f") : secret "canary-serving-cert" not found Apr 24 21:16:23.035913 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:23.035747 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:23.035913 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:23.035869 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls podName:593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:27.035857652 +0000 UTC m=+40.644996608 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls") pod "dns-default-p55cz" (UID: "593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c") : secret "dns-default-metrics-tls" not found Apr 24 21:16:23.181839 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:23.181802 2581 generic.go:358] "Generic (PLEG): container finished" podID="2a6bc573-bfae-4ef4-a14b-3d3958d53365" containerID="4e408f3efc23e416b2b98bcbedd85e795cf28bb9dd04955808573b3dc7b228b3" exitCode=0 Apr 24 21:16:23.182214 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:23.181854 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" event={"ID":"2a6bc573-bfae-4ef4-a14b-3d3958d53365","Type":"ContainerDied","Data":"4e408f3efc23e416b2b98bcbedd85e795cf28bb9dd04955808573b3dc7b228b3"} Apr 24 21:16:24.186623 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:24.186590 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" event={"ID":"2a6bc573-bfae-4ef4-a14b-3d3958d53365","Type":"ContainerStarted","Data":"6152735159d39786cc838e8ba5ff5000fb94716da5193b26855d52b57fe854c0"} Apr 24 21:16:24.219763 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:24.219709 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qmnsd" podStartSLOduration=4.511332604 podStartE2EDuration="37.219693305s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:15:48.26499074 +0000 UTC m=+1.874129697" lastFinishedPulling="2026-04-24 21:16:20.973351442 +0000 UTC m=+34.582490398" observedRunningTime="2026-04-24 21:16:24.218540632 +0000 UTC m=+37.827679611" watchObservedRunningTime="2026-04-24 21:16:24.219693305 +0000 UTC m=+37.828832286" Apr 24 21:16:24.849692 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:24.849646 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:24.853370 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:24.853340 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64430b8d-991a-4176-a2b8-b3c3e21f20ba-original-pull-secret\") pod \"global-pull-secret-syncer-svh9x\" (UID: \"64430b8d-991a-4176-a2b8-b3c3e21f20ba\") " pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:24.856539 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:24.856514 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-svh9x" Apr 24 21:16:25.027795 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:25.027760 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-svh9x"] Apr 24 21:16:25.031547 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:16:25.031515 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64430b8d_991a_4176_a2b8_b3c3e21f20ba.slice/crio-1b97984269362e475608acdf838af0a237bb4bfc3169aa966ab6b487fa1bcb74 WatchSource:0}: Error finding container 1b97984269362e475608acdf838af0a237bb4bfc3169aa966ab6b487fa1bcb74: Status 404 returned error can't find the container with id 1b97984269362e475608acdf838af0a237bb4bfc3169aa966ab6b487fa1bcb74 Apr 24 21:16:25.189362 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:25.189327 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-svh9x" event={"ID":"64430b8d-991a-4176-a2b8-b3c3e21f20ba","Type":"ContainerStarted","Data":"1b97984269362e475608acdf838af0a237bb4bfc3169aa966ab6b487fa1bcb74"} Apr 24 21:16:27.066125 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:27.066084 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:27.066563 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:27.066177 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:27.066563 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:27.066263 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:27.066563 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:27.066347 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert podName:83a219f3-ecef-475c-85b4-5e5f89df5b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.066331922 +0000 UTC m=+48.675470878 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert") pod "ingress-canary-blv55" (UID: "83a219f3-ecef-475c-85b4-5e5f89df5b6f") : secret "canary-serving-cert" not found Apr 24 21:16:27.066563 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:27.066267 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:27.066563 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:27.066426 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls podName:593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.066411631 +0000 UTC m=+48.675550606 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls") pod "dns-default-p55cz" (UID: "593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c") : secret "dns-default-metrics-tls" not found Apr 24 21:16:29.198854 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:29.198820 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-svh9x" event={"ID":"64430b8d-991a-4176-a2b8-b3c3e21f20ba","Type":"ContainerStarted","Data":"202970e957f62b33697bff5a9f6fae60b3d1dbd4b349ae6ecb2f201f528dd007"} Apr 24 21:16:29.212815 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:29.212769 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-svh9x" podStartSLOduration=8.231021643 podStartE2EDuration="12.212753875s" podCreationTimestamp="2026-04-24 21:16:17 +0000 UTC" firstStartedPulling="2026-04-24 21:16:25.033380637 +0000 UTC m=+38.642519608" lastFinishedPulling="2026-04-24 21:16:29.015112872 +0000 UTC m=+42.624251840" observedRunningTime="2026-04-24 21:16:29.212545339 +0000 UTC m=+42.821684316" watchObservedRunningTime="2026-04-24 21:16:29.212753875 +0000 UTC m=+42.821892854" Apr 24 21:16:35.115825 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:35.115787 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:35.116357 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:35.115834 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:35.116357 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:35.115940 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:35.116357 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:35.115996 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert podName:83a219f3-ecef-475c-85b4-5e5f89df5b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.11598181 +0000 UTC m=+64.725120767 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert") pod "ingress-canary-blv55" (UID: "83a219f3-ecef-475c-85b4-5e5f89df5b6f") : secret "canary-serving-cert" not found Apr 24 21:16:35.116357 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:35.115941 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:35.116357 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:35.116069 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls podName:593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.116057347 +0000 UTC m=+64.725196319 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls") pod "dns-default-p55cz" (UID: "593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c") : secret "dns-default-metrics-tls" not found Apr 24 21:16:45.169011 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:45.168984 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzbzn" Apr 24 21:16:51.126661 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.126614 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:16:51.126661 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.126663 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:16:51.127195 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:51.126759 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:51.127195 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:51.126764 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:51.127195 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:51.126822 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert podName:83a219f3-ecef-475c-85b4-5e5f89df5b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:17:23.126807379 +0000 UTC m=+96.735946335 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert") pod "ingress-canary-blv55" (UID: "83a219f3-ecef-475c-85b4-5e5f89df5b6f") : secret "canary-serving-cert" not found Apr 24 21:16:51.127195 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:51.126834 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls podName:593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c nodeName:}" failed. No retries permitted until 2026-04-24 21:17:23.126828988 +0000 UTC m=+96.735967945 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls") pod "dns-default-p55cz" (UID: "593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c") : secret "dns-default-metrics-tls" not found Apr 24 21:16:51.630078 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.630036 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:51.632598 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.632576 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:16:51.643031 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.643011 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:16:51.653805 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.653775 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2wmr\" (UniqueName: \"kubernetes.io/projected/0d188a02-fe28-4c44-96ea-c22a4f133693-kube-api-access-v2wmr\") pod \"network-check-target-w6vsz\" (UID: \"0d188a02-fe28-4c44-96ea-c22a4f133693\") " pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:51.731097 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.731059 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:16:51.733973 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.733953 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:16:51.741864 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:51.741845 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:16:51.741960 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:16:51.741905 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs podName:ab61ba5a-75b0-4d88-af4b-3e98166b3f50 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:55.741888761 +0000 UTC m=+129.351027719 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs") pod "network-metrics-daemon-mf254" (UID: "ab61ba5a-75b0-4d88-af4b-3e98166b3f50") : secret "metrics-daemon-secret" not found Apr 24 21:16:51.775966 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.775934 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m7v99\"" Apr 24 21:16:51.783889 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.783870 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:51.896349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:51.896319 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w6vsz"] Apr 24 21:16:51.899873 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:16:51.899844 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d188a02_fe28_4c44_96ea_c22a4f133693.slice/crio-a11106ef31a5d7bf1668d0204c83ea602c699e313f0c0930e26a2285888abc98 WatchSource:0}: Error finding container a11106ef31a5d7bf1668d0204c83ea602c699e313f0c0930e26a2285888abc98: Status 404 returned error can't find the container with id a11106ef31a5d7bf1668d0204c83ea602c699e313f0c0930e26a2285888abc98 Apr 24 21:16:52.241044 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:52.240965 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w6vsz" event={"ID":"0d188a02-fe28-4c44-96ea-c22a4f133693","Type":"ContainerStarted","Data":"a11106ef31a5d7bf1668d0204c83ea602c699e313f0c0930e26a2285888abc98"} Apr 24 21:16:55.248105 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:55.248069 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w6vsz" event={"ID":"0d188a02-fe28-4c44-96ea-c22a4f133693","Type":"ContainerStarted","Data":"16073c4130353539c8f66d9a5f83fbcbeec8e8d2ee41da36d8ddc3ffa8e26915"} Apr 24 21:16:55.248487 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:55.248193 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:16:55.268038 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:55.267975 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-w6vsz" podStartSLOduration=65.499211227 podStartE2EDuration="1m8.267957645s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:16:51.901708224 +0000 UTC m=+65.510847185" lastFinishedPulling="2026-04-24 21:16:54.670454642 +0000 UTC m=+68.279593603" observedRunningTime="2026-04-24 21:16:55.267575952 +0000 UTC m=+68.876714943" watchObservedRunningTime="2026-04-24 21:16:55.267957645 +0000 UTC m=+68.877096621" Apr 24 21:16:58.034283 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.034245 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6"] Apr 24 21:16:58.037236 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.037218 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" Apr 24 21:16:58.041094 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.041075 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-k6pqf\"" Apr 24 21:16:58.041207 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.041098 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 21:16:58.041552 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.041534 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 21:16:58.041655 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.041636 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 21:16:58.041725 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.041669 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 21:16:58.043994 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.043975 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j"] Apr 24 21:16:58.046897 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.046876 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.047555 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.047538 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6"] Apr 24 21:16:58.059798 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.059774 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 21:16:58.060294 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.060274 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 21:16:58.060470 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.060457 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 21:16:58.060534 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.060521 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 21:16:58.068611 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.068594 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j"] Apr 24 21:16:58.075436 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.075413 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-hub\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: 
\"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.075527 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.075444 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.075527 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.075467 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwb9\" (UniqueName: \"kubernetes.io/projected/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-kube-api-access-crwb9\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.075527 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.075484 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/661d0ab8-2100-4db6-958a-71ec686b96b0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-845b59fc6f-867v6\" (UID: \"661d0ab8-2100-4db6-958a-71ec686b96b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" Apr 24 21:16:58.075527 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.075500 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr2gf\" (UniqueName: \"kubernetes.io/projected/661d0ab8-2100-4db6-958a-71ec686b96b0-kube-api-access-tr2gf\") pod \"managed-serviceaccount-addon-agent-845b59fc6f-867v6\" (UID: \"661d0ab8-2100-4db6-958a-71ec686b96b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" Apr 24 21:16:58.075664 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.075573 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.075664 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.075597 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.075766 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.075665 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-ca\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 
21:16:58.176581 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.176551 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-ca\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.176744 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.176593 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-hub\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.176744 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.176611 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.176744 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.176631 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crwb9\" (UniqueName: \"kubernetes.io/projected/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-kube-api-access-crwb9\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.176744 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.176656 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/661d0ab8-2100-4db6-958a-71ec686b96b0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-845b59fc6f-867v6\" (UID: \"661d0ab8-2100-4db6-958a-71ec686b96b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" Apr 24 21:16:58.176744 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.176675 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2gf\" (UniqueName: \"kubernetes.io/projected/661d0ab8-2100-4db6-958a-71ec686b96b0-kube-api-access-tr2gf\") pod \"managed-serviceaccount-addon-agent-845b59fc6f-867v6\" (UID: \"661d0ab8-2100-4db6-958a-71ec686b96b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" Apr 24 21:16:58.176744 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.176704 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.176744 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.176730 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-service-proxy-server-cert\") pod 
\"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.178433 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.178387 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.179086 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.179066 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-ca\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.179261 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.179235 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.179361 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.179344 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-hub\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.179729 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.179706 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/661d0ab8-2100-4db6-958a-71ec686b96b0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-845b59fc6f-867v6\" (UID: \"661d0ab8-2100-4db6-958a-71ec686b96b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" Apr 24 21:16:58.179862 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.179845 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.194019 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.193994 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwb9\" (UniqueName: \"kubernetes.io/projected/621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9-kube-api-access-crwb9\") pod \"cluster-proxy-proxy-agent-f85b8cc4b-dzc5j\" (UID: \"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.195216 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.195196 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2gf\" (UniqueName: 
\"kubernetes.io/projected/661d0ab8-2100-4db6-958a-71ec686b96b0-kube-api-access-tr2gf\") pod \"managed-serviceaccount-addon-agent-845b59fc6f-867v6\" (UID: \"661d0ab8-2100-4db6-958a-71ec686b96b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" Apr 24 21:16:58.364508 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.364384 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" Apr 24 21:16:58.372147 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.372121 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:16:58.487630 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.487600 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6"] Apr 24 21:16:58.492889 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:16:58.492864 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod661d0ab8_2100_4db6_958a_71ec686b96b0.slice/crio-9bfa1964bee4646493880acde3b662cbc581bdc4de98a0a319ff4d05bea25cbc WatchSource:0}: Error finding container 9bfa1964bee4646493880acde3b662cbc581bdc4de98a0a319ff4d05bea25cbc: Status 404 returned error can't find the container with id 9bfa1964bee4646493880acde3b662cbc581bdc4de98a0a319ff4d05bea25cbc Apr 24 21:16:58.505085 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:58.505058 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j"] Apr 24 21:16:58.507784 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:16:58.507754 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod621cd8c1_dfe4_4f4e_892a_2ef6aa4b3bc9.slice/crio-2c029597fd005bd16f6370a24e8027b6f9d174152656470ff53b85252c81ad32 WatchSource:0}: Error finding container 2c029597fd005bd16f6370a24e8027b6f9d174152656470ff53b85252c81ad32: Status 404 returned error can't find the container with id 2c029597fd005bd16f6370a24e8027b6f9d174152656470ff53b85252c81ad32 Apr 24 21:16:59.256874 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:59.256839 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" event={"ID":"661d0ab8-2100-4db6-958a-71ec686b96b0","Type":"ContainerStarted","Data":"9bfa1964bee4646493880acde3b662cbc581bdc4de98a0a319ff4d05bea25cbc"} Apr 24 21:16:59.258115 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:16:59.258086 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" event={"ID":"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9","Type":"ContainerStarted","Data":"2c029597fd005bd16f6370a24e8027b6f9d174152656470ff53b85252c81ad32"} Apr 24 21:17:02.266535 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:02.266496 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" event={"ID":"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9","Type":"ContainerStarted","Data":"6aa38f3d21ef9d69534ffeee3e253c112954898d061b93d751cb02b61500a833"} Apr 24 21:17:02.268093 ip-10-0-132-159 kubenswrapper[2581]: I0424 
21:17:02.268065 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" event={"ID":"661d0ab8-2100-4db6-958a-71ec686b96b0","Type":"ContainerStarted","Data":"c5ed8a1229413e8ff7397f7330debb6b8007d59b1b99ab05c0379592ca4ecab0"} Apr 24 21:17:02.285886 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:02.285827 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-845b59fc6f-867v6" podStartSLOduration=1.192408951 podStartE2EDuration="4.285809617s" podCreationTimestamp="2026-04-24 21:16:58 +0000 UTC" firstStartedPulling="2026-04-24 21:16:58.494650788 +0000 UTC m=+72.103789745" lastFinishedPulling="2026-04-24 21:17:01.588051454 +0000 UTC m=+75.197190411" observedRunningTime="2026-04-24 21:17:02.284275743 +0000 UTC m=+75.893414738" watchObservedRunningTime="2026-04-24 21:17:02.285809617 +0000 UTC m=+75.894948596" Apr 24 21:17:04.274669 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:04.274634 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" event={"ID":"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9","Type":"ContainerStarted","Data":"2e0febadf75eefcc1de1a87ec67e10d7035a301bc259d4705af1bd0f9259ee85"} Apr 24 21:17:04.274669 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:04.274672 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" event={"ID":"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9","Type":"ContainerStarted","Data":"b0152c74a9dfe3028c50d2113a77aa8a2c60bc2240efcc487e62982b16198316"} Apr 24 21:17:04.294559 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:04.294511 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" podStartSLOduration=1.269959578 podStartE2EDuration="6.29449691s" podCreationTimestamp="2026-04-24 21:16:58 +0000 UTC" firstStartedPulling="2026-04-24 21:16:58.509282734 +0000 UTC m=+72.118421691" lastFinishedPulling="2026-04-24 21:17:03.533820067 +0000 UTC m=+77.142959023" observedRunningTime="2026-04-24 21:17:04.293283734 +0000 UTC m=+77.902422710" watchObservedRunningTime="2026-04-24 21:17:04.29449691 +0000 UTC m=+77.903635889" Apr 24 21:17:23.148767 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:23.148728 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:17:23.149236 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:23.148791 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:17:23.149236 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:17:23.148882 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:17:23.149236 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:17:23.148915 2581 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:17:23.149236 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:17:23.148970 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert podName:83a219f3-ecef-475c-85b4-5e5f89df5b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:18:27.148953534 +0000 UTC m=+160.758092496 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert") pod "ingress-canary-blv55" (UID: "83a219f3-ecef-475c-85b4-5e5f89df5b6f") : secret "canary-serving-cert" not found Apr 24 21:17:23.149236 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:17:23.148985 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls podName:593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c nodeName:}" failed. No retries permitted until 2026-04-24 21:18:27.14897854 +0000 UTC m=+160.758117497 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls") pod "dns-default-p55cz" (UID: "593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c") : secret "dns-default-metrics-tls" not found Apr 24 21:17:26.251810 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:26.251776 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-w6vsz" Apr 24 21:17:37.880011 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:37.879986 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tc6g6_a8617564-8813-486b-aaeb-9fd4ef61ca2f/dns-node-resolver/0.log" Apr 24 21:17:38.679243 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:38.679213 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-flzqj_8f5835d0-33c0-4340-bfe0-67872e19c79e/node-ca/0.log" Apr 24 21:17:55.780173 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:55.780132 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:17:55.782496 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:55.782472 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab61ba5a-75b0-4d88-af4b-3e98166b3f50-metrics-certs\") pod \"network-metrics-daemon-mf254\" (UID: \"ab61ba5a-75b0-4d88-af4b-3e98166b3f50\") " pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:17:55.961009 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:55.960975 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2rwk\"" Apr 24 21:17:55.969034 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:55.969014 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf254" Apr 24 21:17:56.082267 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:56.082238 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mf254"] Apr 24 21:17:56.085623 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:17:56.085593 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab61ba5a_75b0_4d88_af4b_3e98166b3f50.slice/crio-df62ab57572b3e7409f582ac562c0dac38c64ee654d51d1df423b5ca9d3271fe WatchSource:0}: Error finding container df62ab57572b3e7409f582ac562c0dac38c64ee654d51d1df423b5ca9d3271fe: Status 404 returned error can't find the container with id df62ab57572b3e7409f582ac562c0dac38c64ee654d51d1df423b5ca9d3271fe Apr 24 21:17:56.400129 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:56.400062 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mf254" event={"ID":"ab61ba5a-75b0-4d88-af4b-3e98166b3f50","Type":"ContainerStarted","Data":"df62ab57572b3e7409f582ac562c0dac38c64ee654d51d1df423b5ca9d3271fe"} Apr 24 21:17:57.404277 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:57.404246 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mf254" event={"ID":"ab61ba5a-75b0-4d88-af4b-3e98166b3f50","Type":"ContainerStarted","Data":"59b8815636fc491009669a2378657f89883be799066d5137d800015b150b1cbc"} Apr 24 21:17:57.404277 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:57.404283 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mf254" event={"ID":"ab61ba5a-75b0-4d88-af4b-3e98166b3f50","Type":"ContainerStarted","Data":"9d0f18da0c6677b949d98e253e8f45615a7ccd262bfa6dd06e1574b1a6d19f42"} Apr 24 21:17:57.421722 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:17:57.421670 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mf254" podStartSLOduration=129.564262458 podStartE2EDuration="2m10.421645783s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:17:56.087451123 +0000 UTC m=+129.696590101" lastFinishedPulling="2026-04-24 21:17:56.944834469 +0000 UTC m=+130.553973426" observedRunningTime="2026-04-24 21:17:57.421032141 +0000 UTC m=+131.030171121" watchObservedRunningTime="2026-04-24 21:17:57.421645783 +0000 UTC m=+131.030784764" Apr 24 21:18:03.100560 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.100526 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-n68s2"] Apr 24 21:18:03.103564 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.103549 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.107050 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.107028 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:18:03.108382 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.108362 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:18:03.108382 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.108379 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:18:03.108566 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.108414 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:18:03.108566 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.108489 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hxfzx\"" Apr 24 21:18:03.120852 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.120834 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n68s2"] Apr 24 21:18:03.207276 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.207240 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-556844449-86knp"] Apr 24 21:18:03.210102 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.210087 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.213091 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.213072 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:18:03.213205 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.213116 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:18:03.213723 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.213707 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:18:03.217469 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.217452 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bc8qn\"" Apr 24 21:18:03.232964 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.232939 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-data-volume\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.233056 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.232969 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " 
pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.233056 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.232990 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-crio-socket\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.233056 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.233009 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.233163 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.233052 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sgw5\" (UniqueName: \"kubernetes.io/projected/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-kube-api-access-7sgw5\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.239748 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.239730 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:18:03.256467 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.256425 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-556844449-86knp"] Apr 24 21:18:03.333699 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.333670 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e37e936-94ab-4cbe-bc5b-157388985568-trusted-ca\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.333952 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.333727 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sfch\" (UniqueName: \"kubernetes.io/projected/8e37e936-94ab-4cbe-bc5b-157388985568-kube-api-access-2sfch\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.333952 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.333755 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e37e936-94ab-4cbe-bc5b-157388985568-bound-sa-token\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.333952 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.333775 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e37e936-94ab-4cbe-bc5b-157388985568-installation-pull-secrets\") pod \"image-registry-556844449-86knp\" (UID: 
\"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.333952 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.333816 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-data-volume\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.333952 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.333870 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.334245 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334069 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.334245 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334113 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-crio-socket\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.334245 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334149 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e37e936-94ab-4cbe-bc5b-157388985568-registry-tls\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.334245 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334174 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e37e936-94ab-4cbe-bc5b-157388985568-ca-trust-extracted\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.334245 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334202 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e37e936-94ab-4cbe-bc5b-157388985568-image-registry-private-configuration\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.334526 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334256 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sgw5\" (UniqueName: \"kubernetes.io/projected/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-kube-api-access-7sgw5\") pod \"insights-runtime-extractor-n68s2\" (UID: 
\"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.334526 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334288 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e37e936-94ab-4cbe-bc5b-157388985568-registry-certificates\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.334526 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334514 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.334665 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334605 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-crio-socket\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.334827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.334799 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-data-volume\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.337274 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.337257 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.346611 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.346590 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sgw5\" (UniqueName: \"kubernetes.io/projected/2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b-kube-api-access-7sgw5\") pod \"insights-runtime-extractor-n68s2\" (UID: \"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b\") " pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.412617 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.412584 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n68s2" Apr 24 21:18:03.434820 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.434793 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e37e936-94ab-4cbe-bc5b-157388985568-registry-certificates\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.434940 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.434830 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e37e936-94ab-4cbe-bc5b-157388985568-trusted-ca\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.434940 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.434869 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sfch\" (UniqueName: \"kubernetes.io/projected/8e37e936-94ab-4cbe-bc5b-157388985568-kube-api-access-2sfch\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.434940 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.434886 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e37e936-94ab-4cbe-bc5b-157388985568-bound-sa-token\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.434940 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.434903 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e37e936-94ab-4cbe-bc5b-157388985568-installation-pull-secrets\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.435148 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.434992 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e37e936-94ab-4cbe-bc5b-157388985568-registry-tls\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.435148 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.435017 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e37e936-94ab-4cbe-bc5b-157388985568-ca-trust-extracted\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.435148 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.435042 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e37e936-94ab-4cbe-bc5b-157388985568-image-registry-private-configuration\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " 
pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.435637 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.435608 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e37e936-94ab-4cbe-bc5b-157388985568-registry-certificates\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.435761 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.435654 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e37e936-94ab-4cbe-bc5b-157388985568-ca-trust-extracted\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.435958 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.435938 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e37e936-94ab-4cbe-bc5b-157388985568-trusted-ca\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.438098 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.438079 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e37e936-94ab-4cbe-bc5b-157388985568-installation-pull-secrets\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.438232 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.438211 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e37e936-94ab-4cbe-bc5b-157388985568-registry-tls\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.438232 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.438216 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e37e936-94ab-4cbe-bc5b-157388985568-image-registry-private-configuration\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.455049 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.455027 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e37e936-94ab-4cbe-bc5b-157388985568-bound-sa-token\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.457237 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.457210 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sfch\" (UniqueName: \"kubernetes.io/projected/8e37e936-94ab-4cbe-bc5b-157388985568-kube-api-access-2sfch\") pod \"image-registry-556844449-86knp\" (UID: \"8e37e936-94ab-4cbe-bc5b-157388985568\") " pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.518421 ip-10-0-132-159 
kubenswrapper[2581]: I0424 21:18:03.518377 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:03.543104 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.541661 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n68s2"] Apr 24 21:18:03.670139 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:03.670043 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-556844449-86knp"] Apr 24 21:18:03.674342 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:18:03.674310 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e37e936_94ab_4cbe_bc5b_157388985568.slice/crio-1eccb5aebf300fa9976f5a6b58dba1ee8e780eea8815c9518943d1e87a890674 WatchSource:0}: Error finding container 1eccb5aebf300fa9976f5a6b58dba1ee8e780eea8815c9518943d1e87a890674: Status 404 returned error can't find the container with id 1eccb5aebf300fa9976f5a6b58dba1ee8e780eea8815c9518943d1e87a890674 Apr 24 21:18:04.421623 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:04.421529 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-556844449-86knp" event={"ID":"8e37e936-94ab-4cbe-bc5b-157388985568","Type":"ContainerStarted","Data":"c0a820f96a2af4e8a646994aaf9eb5954a1395d7f07d087f049e0a4d33bf9148"} Apr 24 21:18:04.421623 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:04.421573 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-556844449-86knp" event={"ID":"8e37e936-94ab-4cbe-bc5b-157388985568","Type":"ContainerStarted","Data":"1eccb5aebf300fa9976f5a6b58dba1ee8e780eea8815c9518943d1e87a890674"} Apr 24 21:18:04.422138 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:04.421697 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:04.423082 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:04.423058 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n68s2" event={"ID":"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b","Type":"ContainerStarted","Data":"659ea1850b85dcc762097cee35e3eeff9ab3617f9a1f9d7989ba80c4f8a91881"} Apr 24 21:18:04.423199 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:04.423086 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n68s2" event={"ID":"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b","Type":"ContainerStarted","Data":"23cbb72a0d6b471e178737515cb3bbec800c5f3e2b5b95ee6f384c69daeacb4b"} Apr 24 21:18:04.423199 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:04.423101 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n68s2" event={"ID":"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b","Type":"ContainerStarted","Data":"67ee189c7515a77c6bdcaeffcbd49085e7124142a4ff1f6b4fe6ef0a957bb49c"} Apr 24 21:18:04.444778 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:04.444733 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-556844449-86knp" podStartSLOduration=1.444717527 podStartE2EDuration="1.444717527s" podCreationTimestamp="2026-04-24 21:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 21:18:04.44364203 +0000 UTC m=+138.052781009" watchObservedRunningTime="2026-04-24 21:18:04.444717527 +0000 UTC m=+138.053856484" Apr 24 21:18:06.430121 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:06.430080 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n68s2" event={"ID":"2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b","Type":"ContainerStarted","Data":"a6a0ed057437cdf5d9790be6d6f6476935b1749d7458eb5306b6cce0e57adc32"} Apr 24 21:18:06.455487 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:06.455437 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-n68s2" podStartSLOduration=1.6144667350000002 podStartE2EDuration="3.455422497s" podCreationTimestamp="2026-04-24 21:18:03 +0000 UTC" firstStartedPulling="2026-04-24 21:18:03.604897136 +0000 UTC m=+137.214036094" lastFinishedPulling="2026-04-24 21:18:05.445852898 +0000 UTC m=+139.054991856" observedRunningTime="2026-04-24 21:18:06.454435179 +0000 UTC m=+140.063574157" watchObservedRunningTime="2026-04-24 21:18:06.455422497 +0000 UTC m=+140.064561478" Apr 24 21:18:10.553106 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.553070 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-sn52d"] Apr 24 21:18:10.557916 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.557892 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.560181 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.560151 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:18:10.560280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.560209 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:18:10.560280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.560157 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:18:10.560280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.560246 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:18:10.560280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.560264 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-f2sfg\"" Apr 24 21:18:10.560487 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.560268 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:18:10.560970 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.560955 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:18:10.690814 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.690763 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 
24 21:18:10.690814 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.690822 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9bnq\" (UniqueName: \"kubernetes.io/projected/c603a9c8-437b-42f3-960b-865acebe96ec-kube-api-access-m9bnq\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.691028 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.690863 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-accelerators-collector-config\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.691028 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.690889 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c603a9c8-437b-42f3-960b-865acebe96ec-root\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.691028 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.690909 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c603a9c8-437b-42f3-960b-865acebe96ec-sys\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.691028 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.690937 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-tls\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.691028 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.690953 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-wtmp\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.691028 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.690978 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-textfile\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.691228 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.691024 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c603a9c8-437b-42f3-960b-865acebe96ec-metrics-client-ca\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792169 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792141 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-tls\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792169 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792171 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-wtmp\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792428 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792192 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-textfile\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792428 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792211 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c603a9c8-437b-42f3-960b-865acebe96ec-metrics-client-ca\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792428 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:18:10.792305 2581 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:18:10.792428 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792354 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792428 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792355 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-wtmp\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792428 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:18:10.792376 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-tls podName:c603a9c8-437b-42f3-960b-865acebe96ec nodeName:}" failed. No retries permitted until 2026-04-24 21:18:11.292354913 +0000 UTC m=+144.901493871 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-tls") pod "node-exporter-sn52d" (UID: "c603a9c8-437b-42f3-960b-865acebe96ec") : secret "node-exporter-tls" not found Apr 24 21:18:10.792737 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792455 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9bnq\" (UniqueName: \"kubernetes.io/projected/c603a9c8-437b-42f3-960b-865acebe96ec-kube-api-access-m9bnq\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792737 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792490 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-accelerators-collector-config\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792737 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792523 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c603a9c8-437b-42f3-960b-865acebe96ec-root\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792737 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792563 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c603a9c8-437b-42f3-960b-865acebe96ec-sys\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792737 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792585 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-textfile\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792737 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792649 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c603a9c8-437b-42f3-960b-865acebe96ec-sys\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.792737 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792645 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c603a9c8-437b-42f3-960b-865acebe96ec-root\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.793069 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.792951 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c603a9c8-437b-42f3-960b-865acebe96ec-metrics-client-ca\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.793162 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.793143 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-accelerators-collector-config\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.794625 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.794604 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:10.800817 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:10.800793 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9bnq\" (UniqueName: \"kubernetes.io/projected/c603a9c8-437b-42f3-960b-865acebe96ec-kube-api-access-m9bnq\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:11.296986 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:11.296950 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-tls\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:11.299202 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:11.299184 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c603a9c8-437b-42f3-960b-865acebe96ec-node-exporter-tls\") pod \"node-exporter-sn52d\" (UID: \"c603a9c8-437b-42f3-960b-865acebe96ec\") " pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:11.466907 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:11.466866 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-sn52d" Apr 24 21:18:11.474740 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:18:11.474714 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc603a9c8_437b_42f3_960b_865acebe96ec.slice/crio-afb46e2863780aedd6be7b688ad42947f58b324a5d8ad460050c04424ad78ea1 WatchSource:0}: Error finding container afb46e2863780aedd6be7b688ad42947f58b324a5d8ad460050c04424ad78ea1: Status 404 returned error can't find the container with id afb46e2863780aedd6be7b688ad42947f58b324a5d8ad460050c04424ad78ea1 Apr 24 21:18:12.445098 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:12.445014 2581 generic.go:358] "Generic (PLEG): container finished" podID="c603a9c8-437b-42f3-960b-865acebe96ec" containerID="eb37dcf1e3ed07f78c96d8c51a73254b10e0ada569b73967b149e04f52cc05bf" exitCode=0 Apr 24 21:18:12.445098 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:12.445064 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sn52d" event={"ID":"c603a9c8-437b-42f3-960b-865acebe96ec","Type":"ContainerDied","Data":"eb37dcf1e3ed07f78c96d8c51a73254b10e0ada569b73967b149e04f52cc05bf"} Apr 24 21:18:12.445098 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:12.445090 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sn52d" event={"ID":"c603a9c8-437b-42f3-960b-865acebe96ec","Type":"ContainerStarted","Data":"afb46e2863780aedd6be7b688ad42947f58b324a5d8ad460050c04424ad78ea1"} Apr 24 21:18:13.449138 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:13.449106 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sn52d" event={"ID":"c603a9c8-437b-42f3-960b-865acebe96ec","Type":"ContainerStarted","Data":"62802df16304f18ffdc676158a82376f2b3db28ba78a62e8dd8ea829da57befe"} Apr 24 21:18:13.449138 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:13.449146 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sn52d" event={"ID":"c603a9c8-437b-42f3-960b-865acebe96ec","Type":"ContainerStarted","Data":"61e4f089e16a0902d867177d139f0429fe3849ed4cd90bf7db76dfc54527e7b0"} Apr 24 21:18:13.468138 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:13.468083 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-sn52d" podStartSLOduration=2.757826873 podStartE2EDuration="3.468066069s" podCreationTimestamp="2026-04-24 21:18:10 +0000 UTC" firstStartedPulling="2026-04-24 21:18:11.476511996 +0000 UTC m=+145.085650957" lastFinishedPulling="2026-04-24 21:18:12.186751196 +0000 UTC m=+145.795890153" observedRunningTime="2026-04-24 21:18:13.467063973 +0000 UTC m=+147.076202952" watchObservedRunningTime="2026-04-24 21:18:13.468066069 +0000 UTC m=+147.077205051" Apr 24 21:18:18.373111 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:18.373039 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" podUID="621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:18:22.257577 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:18:22.257532 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-p55cz" 
podUID="593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c" Apr 24 21:18:22.289518 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:18:22.289484 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-blv55" podUID="83a219f3-ecef-475c-85b4-5e5f89df5b6f" Apr 24 21:18:22.470293 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:22.470264 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p55cz" Apr 24 21:18:22.470293 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:22.470300 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:18:25.429850 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:25.429819 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-556844449-86knp" Apr 24 21:18:27.218535 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.218500 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:18:27.218535 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.218539 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:18:27.220812 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.220782 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c-metrics-tls\") pod \"dns-default-p55cz\" (UID: \"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c\") " pod="openshift-dns/dns-default-p55cz" Apr 24 21:18:27.220965 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.220948 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83a219f3-ecef-475c-85b4-5e5f89df5b6f-cert\") pod \"ingress-canary-blv55\" (UID: \"83a219f3-ecef-475c-85b4-5e5f89df5b6f\") " pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:18:27.272995 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.272969 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-txjjd\"" Apr 24 21:18:27.273380 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.273365 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qvvhg\"" Apr 24 21:18:27.281287 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.281260 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p55cz" Apr 24 21:18:27.281350 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.281333 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-blv55" Apr 24 21:18:27.411013 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.410961 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p55cz"] Apr 24 21:18:27.415050 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:18:27.415020 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod593945d9_3e3d_4c47_ad8e_c3ce4c6dd77c.slice/crio-6e3fd9a98e5e5a28dc0cdaa7238f29c34ccbdafdd0d35d8edc61907e9ade8f1f WatchSource:0}: Error finding container 6e3fd9a98e5e5a28dc0cdaa7238f29c34ccbdafdd0d35d8edc61907e9ade8f1f: Status 404 returned error can't find the container with id 6e3fd9a98e5e5a28dc0cdaa7238f29c34ccbdafdd0d35d8edc61907e9ade8f1f Apr 24 21:18:27.435488 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.435464 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-blv55"] Apr 24 21:18:27.440235 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:18:27.440207 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a219f3_ecef_475c_85b4_5e5f89df5b6f.slice/crio-a5786e490c6e1a139a3984ca182e08a9398de1b643783222764f8216e561515b WatchSource:0}: Error finding container a5786e490c6e1a139a3984ca182e08a9398de1b643783222764f8216e561515b: Status 404 returned error can't find the container with id a5786e490c6e1a139a3984ca182e08a9398de1b643783222764f8216e561515b Apr 24 21:18:27.482363 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.482278 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-blv55" event={"ID":"83a219f3-ecef-475c-85b4-5e5f89df5b6f","Type":"ContainerStarted","Data":"a5786e490c6e1a139a3984ca182e08a9398de1b643783222764f8216e561515b"} Apr 24 21:18:27.483204 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:27.483172 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p55cz" event={"ID":"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c","Type":"ContainerStarted","Data":"6e3fd9a98e5e5a28dc0cdaa7238f29c34ccbdafdd0d35d8edc61907e9ade8f1f"} Apr 24 21:18:28.373164 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:28.373123 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" podUID="621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:18:29.490282 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:29.490138 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p55cz" event={"ID":"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c","Type":"ContainerStarted","Data":"cbcb5e5b8bac65a7a3e92ab5456afdb70c6e4760e9c5d746f8deec69e6c4d709"} Apr 24 21:18:29.490282 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:29.490177 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p55cz" event={"ID":"593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c","Type":"ContainerStarted","Data":"20a8b620236e7092761670bedfa136f29ff877153ae6d356682e46b040791fae"} Apr 24 21:18:29.490783 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:29.490298 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p55cz" Apr 24 21:18:29.491680 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:29.491658 2581 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-blv55" event={"ID":"83a219f3-ecef-475c-85b4-5e5f89df5b6f","Type":"ContainerStarted","Data":"6c9b5c7f6cc5ee39e9ca72558b622d01fed72894081b75e21b73c84608315190"} Apr 24 21:18:29.507902 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:29.507849 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p55cz" podStartSLOduration=128.69599262 podStartE2EDuration="2m10.507816971s" podCreationTimestamp="2026-04-24 21:16:19 +0000 UTC" firstStartedPulling="2026-04-24 21:18:27.416916936 +0000 UTC m=+161.026055893" lastFinishedPulling="2026-04-24 21:18:29.22874127 +0000 UTC m=+162.837880244" observedRunningTime="2026-04-24 21:18:29.506517447 +0000 UTC m=+163.115656425" watchObservedRunningTime="2026-04-24 21:18:29.507816971 +0000 UTC m=+163.116955952" Apr 24 21:18:29.521647 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:29.521594 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-blv55" podStartSLOduration=128.730586498 podStartE2EDuration="2m10.521579128s" podCreationTimestamp="2026-04-24 21:16:19 +0000 UTC" firstStartedPulling="2026-04-24 21:18:27.442145616 +0000 UTC m=+161.051284580" lastFinishedPulling="2026-04-24 21:18:29.233138239 +0000 UTC m=+162.842277210" observedRunningTime="2026-04-24 21:18:29.521254552 +0000 UTC m=+163.130393531" watchObservedRunningTime="2026-04-24 21:18:29.521579128 +0000 UTC m=+163.130718109" Apr 24 21:18:38.373841 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:38.373758 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" podUID="621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:18:38.373841 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:38.373829 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" Apr 24 21:18:38.374424 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:38.374369 2581 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"2e0febadf75eefcc1de1a87ec67e10d7035a301bc259d4705af1bd0f9259ee85"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 21:18:38.374502 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:38.374486 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" podUID="621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9" containerName="service-proxy" containerID="cri-o://2e0febadf75eefcc1de1a87ec67e10d7035a301bc259d4705af1bd0f9259ee85" gracePeriod=30 Apr 24 21:18:38.516306 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:38.516278 2581 generic.go:358] "Generic (PLEG): container finished" podID="621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9" containerID="2e0febadf75eefcc1de1a87ec67e10d7035a301bc259d4705af1bd0f9259ee85" exitCode=2 Apr 24 21:18:38.516423 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:38.516349 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" 
event={"ID":"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9","Type":"ContainerDied","Data":"2e0febadf75eefcc1de1a87ec67e10d7035a301bc259d4705af1bd0f9259ee85"} Apr 24 21:18:39.496230 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:39.496197 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p55cz" Apr 24 21:18:39.521723 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:39.521682 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f85b8cc4b-dzc5j" event={"ID":"621cd8c1-dfe4-4f4e-892a-2ef6aa4b3bc9","Type":"ContainerStarted","Data":"78d1ca30cffb9d64b0d444b5af50b64b23fdf4e5f60a0d3338004ae70893dd11"} Apr 24 21:18:59.497902 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:59.497874 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sn52d_c603a9c8-437b-42f3-960b-865acebe96ec/init-textfile/0.log" Apr 24 21:18:59.698678 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:59.698625 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sn52d_c603a9c8-437b-42f3-960b-865acebe96ec/node-exporter/0.log" Apr 24 21:18:59.897602 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:18:59.897572 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sn52d_c603a9c8-437b-42f3-960b-865acebe96ec/kube-rbac-proxy/0.log" Apr 24 21:20:46.868000 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:20:46.867959 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:20:46.868487 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:20:46.868015 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:20:46.871797 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:20:46.871778 2581 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:21:51.695723 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.695688 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt"] Apr 24 21:21:51.698767 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.698749 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:51.701087 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.701068 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:21:51.701704 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.701687 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bq5mc\"" Apr 24 21:21:51.701767 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.701707 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:21:51.707501 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.707479 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt"] Apr 24 21:21:51.825949 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.825901 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:51.825949 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.825948 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:51.826167 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.826010 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk7fr\" (UniqueName: \"kubernetes.io/projected/27e9c245-1d78-426f-bdcf-1cbd662af83a-kube-api-access-wk7fr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:51.926342 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.926304 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:51.926342 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.926344 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:51.926551 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.926374 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wk7fr\" (UniqueName: \"kubernetes.io/projected/27e9c245-1d78-426f-bdcf-1cbd662af83a-kube-api-access-wk7fr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:51.926705 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.926687 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:51.926777 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.926754 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:51.935561 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:51.935535 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk7fr\" (UniqueName: \"kubernetes.io/projected/27e9c245-1d78-426f-bdcf-1cbd662af83a-kube-api-access-wk7fr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:52.008198 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:52.008105 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:21:52.124437 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:52.124382 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt"] Apr 24 21:21:52.128571 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:21:52.128545 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27e9c245_1d78_426f_bdcf_1cbd662af83a.slice/crio-0214148f307cb95b89eeb0ddd4b329ab528176c64040db6ffd5f98cc5618253f WatchSource:0}: Error finding container 0214148f307cb95b89eeb0ddd4b329ab528176c64040db6ffd5f98cc5618253f: Status 404 returned error can't find the container with id 0214148f307cb95b89eeb0ddd4b329ab528176c64040db6ffd5f98cc5618253f Apr 24 21:21:52.134491 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:52.134470 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:21:52.999165 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:52.999123 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" event={"ID":"27e9c245-1d78-426f-bdcf-1cbd662af83a","Type":"ContainerStarted","Data":"0214148f307cb95b89eeb0ddd4b329ab528176c64040db6ffd5f98cc5618253f"} Apr 24 21:21:58.014751 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:58.014719 2581 generic.go:358] "Generic (PLEG): container finished" podID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerID="b015b13cce8f5dd329ab3d43f4d6ca8283e6c5d379afb5bf425c505b455f5c3e" exitCode=0 Apr 24 21:21:58.015161 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:21:58.014806 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" event={"ID":"27e9c245-1d78-426f-bdcf-1cbd662af83a","Type":"ContainerDied","Data":"b015b13cce8f5dd329ab3d43f4d6ca8283e6c5d379afb5bf425c505b455f5c3e"} Apr 24 21:22:04.032280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:04.032250 2581 generic.go:358] "Generic (PLEG): container finished" podID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerID="9814e5693eee742e48f3a2968f6257bed7e567cddccd31fa0daf58a86ac2e550" exitCode=0 Apr 24 21:22:04.032619 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:04.032315 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" event={"ID":"27e9c245-1d78-426f-bdcf-1cbd662af83a","Type":"ContainerDied","Data":"9814e5693eee742e48f3a2968f6257bed7e567cddccd31fa0daf58a86ac2e550"} Apr 24 21:22:12.056076 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:12.056038 2581 generic.go:358] "Generic (PLEG): container finished" podID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerID="1a95ab0500660164a6390be6690e8011a45a4aff810f39b1bd9c2b82232e9484" exitCode=0 Apr 24 21:22:12.056469 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:12.056112 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" event={"ID":"27e9c245-1d78-426f-bdcf-1cbd662af83a","Type":"ContainerDied","Data":"1a95ab0500660164a6390be6690e8011a45a4aff810f39b1bd9c2b82232e9484"} Apr 24 21:22:13.174761 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.174735 2581 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:22:13.299230 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.299196 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk7fr\" (UniqueName: \"kubernetes.io/projected/27e9c245-1d78-426f-bdcf-1cbd662af83a-kube-api-access-wk7fr\") pod \"27e9c245-1d78-426f-bdcf-1cbd662af83a\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " Apr 24 21:22:13.299449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.299240 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-bundle\") pod \"27e9c245-1d78-426f-bdcf-1cbd662af83a\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " Apr 24 21:22:13.299449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.299287 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-util\") pod \"27e9c245-1d78-426f-bdcf-1cbd662af83a\" (UID: \"27e9c245-1d78-426f-bdcf-1cbd662af83a\") " Apr 24 21:22:13.299881 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.299854 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-bundle" (OuterVolumeSpecName: "bundle") pod "27e9c245-1d78-426f-bdcf-1cbd662af83a" (UID: "27e9c245-1d78-426f-bdcf-1cbd662af83a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:22:13.301502 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.301477 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e9c245-1d78-426f-bdcf-1cbd662af83a-kube-api-access-wk7fr" (OuterVolumeSpecName: "kube-api-access-wk7fr") pod "27e9c245-1d78-426f-bdcf-1cbd662af83a" (UID: "27e9c245-1d78-426f-bdcf-1cbd662af83a"). InnerVolumeSpecName "kube-api-access-wk7fr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:22:13.303562 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.303540 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-util" (OuterVolumeSpecName: "util") pod "27e9c245-1d78-426f-bdcf-1cbd662af83a" (UID: "27e9c245-1d78-426f-bdcf-1cbd662af83a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:22:13.400634 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.400597 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wk7fr\" (UniqueName: \"kubernetes.io/projected/27e9c245-1d78-426f-bdcf-1cbd662af83a-kube-api-access-wk7fr\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:22:13.400634 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.400630 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-bundle\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:22:13.400634 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:13.400639 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e9c245-1d78-426f-bdcf-1cbd662af83a-util\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:22:14.062833 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:14.062798 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" event={"ID":"27e9c245-1d78-426f-bdcf-1cbd662af83a","Type":"ContainerDied","Data":"0214148f307cb95b89eeb0ddd4b329ab528176c64040db6ffd5f98cc5618253f"} Apr 24 21:22:14.062833 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:14.062831 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0214148f307cb95b89eeb0ddd4b329ab528176c64040db6ffd5f98cc5618253f" Apr 24 21:22:14.063034 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:14.062844 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c258qt" Apr 24 21:22:18.400667 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.400631 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd"] Apr 24 21:22:18.401036 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.400975 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerName="pull" Apr 24 21:22:18.401036 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.400990 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerName="pull" Apr 24 21:22:18.401036 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.401006 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerName="util" Apr 24 21:22:18.401036 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.401016 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerName="util" Apr 24 21:22:18.401036 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.401034 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerName="extract" Apr 24 21:22:18.401187 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.401042 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerName="extract" Apr 24 21:22:18.401187 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.401094 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="27e9c245-1d78-426f-bdcf-1cbd662af83a" containerName="extract" Apr 
24 21:22:18.404328 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.404312 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:18.406648 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.406621 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:22:18.406769 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.406741 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-gmgx2\"" Apr 24 21:22:18.406823 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.406799 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:22:18.406915 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.406899 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:22:18.411770 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.411746 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd"] Apr 24 21:22:18.536709 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.536674 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcq2w\" (UniqueName: \"kubernetes.io/projected/27de9416-e859-4541-8de3-99aea2df99f4-kube-api-access-zcq2w\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-27zdd\" (UID: \"27de9416-e859-4541-8de3-99aea2df99f4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:18.536854 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.536726 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/27de9416-e859-4541-8de3-99aea2df99f4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-27zdd\" (UID: \"27de9416-e859-4541-8de3-99aea2df99f4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:18.637988 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.637940 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/27de9416-e859-4541-8de3-99aea2df99f4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-27zdd\" (UID: \"27de9416-e859-4541-8de3-99aea2df99f4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:18.638136 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.638039 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcq2w\" (UniqueName: \"kubernetes.io/projected/27de9416-e859-4541-8de3-99aea2df99f4-kube-api-access-zcq2w\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-27zdd\" (UID: \"27de9416-e859-4541-8de3-99aea2df99f4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:18.640279 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.640257 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/27de9416-e859-4541-8de3-99aea2df99f4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-27zdd\" (UID: \"27de9416-e859-4541-8de3-99aea2df99f4\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:18.646111 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.646080 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcq2w\" (UniqueName: \"kubernetes.io/projected/27de9416-e859-4541-8de3-99aea2df99f4-kube-api-access-zcq2w\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-27zdd\" (UID: \"27de9416-e859-4541-8de3-99aea2df99f4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:18.715167 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.715078 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:18.832922 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:18.832886 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd"] Apr 24 21:22:18.836258 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:22:18.836231 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27de9416_e859_4541_8de3_99aea2df99f4.slice/crio-15adc1f3d5fc34dea6cfc767f78433cba80cd8dc5c90abd4cdb64dc92e0a751d WatchSource:0}: Error finding container 15adc1f3d5fc34dea6cfc767f78433cba80cd8dc5c90abd4cdb64dc92e0a751d: Status 404 returned error can't find the container with id 15adc1f3d5fc34dea6cfc767f78433cba80cd8dc5c90abd4cdb64dc92e0a751d Apr 24 21:22:19.079704 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:19.079621 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" event={"ID":"27de9416-e859-4541-8de3-99aea2df99f4","Type":"ContainerStarted","Data":"15adc1f3d5fc34dea6cfc767f78433cba80cd8dc5c90abd4cdb64dc92e0a751d"} Apr 24 21:22:22.089865 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.089832 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" event={"ID":"27de9416-e859-4541-8de3-99aea2df99f4","Type":"ContainerStarted","Data":"535e364632cb5c8fc86d2162ece6196952a8f0614b345a647af970ada57da89e"} Apr 24 21:22:22.090282 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.089943 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:22.109157 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.109106 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" podStartSLOduration=1.048971377 podStartE2EDuration="4.109089768s" podCreationTimestamp="2026-04-24 21:22:18 +0000 UTC" firstStartedPulling="2026-04-24 21:22:18.837872335 +0000 UTC m=+392.447011296" lastFinishedPulling="2026-04-24 21:22:21.897990731 +0000 UTC m=+395.507129687" observedRunningTime="2026-04-24 21:22:22.107522482 +0000 UTC m=+395.716661462" watchObservedRunningTime="2026-04-24 21:22:22.109089768 +0000 UTC m=+395.718228748" Apr 24 21:22:22.692688 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.692650 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst"] Apr 24 21:22:22.695669 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.695647 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:22.697936 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.697913 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 21:22:22.698039 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.697920 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-64sjn\"" Apr 24 21:22:22.698039 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.697967 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:22:22.705652 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.705633 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst"] Apr 24 21:22:22.872570 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.872534 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:22.872734 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.872579 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:22.872734 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.872602 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4j4x\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-kube-api-access-h4j4x\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:22.973866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.973789 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:22.973866 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.973841 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:22.974070 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:22.973943 2581 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:22:22.974070 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:22.973958 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:22:22.974070 ip-10-0-132-159 kubenswrapper[2581]: E0424 
21:22:22.973978 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst: references non-existent secret key: tls.crt Apr 24 21:22:22.974070 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.974010 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4j4x\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-kube-api-access-h4j4x\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:22.974070 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:22.974032 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates podName:ac5ef66c-66c6-44c0-87f4-b7edbac7dd85 nodeName:}" failed. No retries permitted until 2026-04-24 21:22:23.474013438 +0000 UTC m=+397.083152408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates") pod "keda-metrics-apiserver-7c9f485588-jvzst" (UID: "ac5ef66c-66c6-44c0-87f4-b7edbac7dd85") : references non-existent secret key: tls.crt Apr 24 21:22:22.974289 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.974215 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:22.987200 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:22.987167 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4j4x\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-kube-api-access-h4j4x\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:23.020886 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.019133 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-tmffv"] Apr 24 21:22:23.023451 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.023427 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:22:23.025679 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.025658 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:22:23.029325 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.029298 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tmffv"] Apr 24 21:22:23.175470 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.175436 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bft6\" (UniqueName: \"kubernetes.io/projected/dea1515b-0167-4e4b-8f70-6abd4be0a28e-kube-api-access-2bft6\") pod \"keda-admission-cf49989db-tmffv\" (UID: \"dea1515b-0167-4e4b-8f70-6abd4be0a28e\") " pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:22:23.175470 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.175481 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dea1515b-0167-4e4b-8f70-6abd4be0a28e-certificates\") pod \"keda-admission-cf49989db-tmffv\" (UID: \"dea1515b-0167-4e4b-8f70-6abd4be0a28e\") " pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:22:23.277026 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.276933 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bft6\" (UniqueName: \"kubernetes.io/projected/dea1515b-0167-4e4b-8f70-6abd4be0a28e-kube-api-access-2bft6\") pod \"keda-admission-cf49989db-tmffv\" (UID: \"dea1515b-0167-4e4b-8f70-6abd4be0a28e\") " pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:22:23.277026 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.276973 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dea1515b-0167-4e4b-8f70-6abd4be0a28e-certificates\") pod \"keda-admission-cf49989db-tmffv\" (UID: \"dea1515b-0167-4e4b-8f70-6abd4be0a28e\") " pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:22:23.279303 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.279281 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dea1515b-0167-4e4b-8f70-6abd4be0a28e-certificates\") pod \"keda-admission-cf49989db-tmffv\" (UID: \"dea1515b-0167-4e4b-8f70-6abd4be0a28e\") " pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:22:23.286084 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.286062 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bft6\" (UniqueName: \"kubernetes.io/projected/dea1515b-0167-4e4b-8f70-6abd4be0a28e-kube-api-access-2bft6\") pod \"keda-admission-cf49989db-tmffv\" (UID: \"dea1515b-0167-4e4b-8f70-6abd4be0a28e\") " pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:22:23.335427 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.335381 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:22:23.456306 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.456275 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tmffv"] Apr 24 21:22:23.460589 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:22:23.460563 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea1515b_0167_4e4b_8f70_6abd4be0a28e.slice/crio-19c253b47e95aa95770ffe73d5bac55dc473ad71e04a74915e97831d929d830a WatchSource:0}: Error finding container 19c253b47e95aa95770ffe73d5bac55dc473ad71e04a74915e97831d929d830a: Status 404 returned error can't find the container with id 19c253b47e95aa95770ffe73d5bac55dc473ad71e04a74915e97831d929d830a Apr 24 21:22:23.478856 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:23.478836 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:23.478963 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:23.478943 2581 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:22:23.478963 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:23.478955 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:22:23.479043 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:23.478971 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst: references non-existent secret key: tls.crt Apr 24 21:22:23.479043 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:23.479012 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates podName:ac5ef66c-66c6-44c0-87f4-b7edbac7dd85 nodeName:}" failed. No retries permitted until 2026-04-24 21:22:24.478998717 +0000 UTC m=+398.088137673 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates") pod "keda-metrics-apiserver-7c9f485588-jvzst" (UID: "ac5ef66c-66c6-44c0-87f4-b7edbac7dd85") : references non-existent secret key: tls.crt Apr 24 21:22:24.096926 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:24.096849 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tmffv" event={"ID":"dea1515b-0167-4e4b-8f70-6abd4be0a28e","Type":"ContainerStarted","Data":"19c253b47e95aa95770ffe73d5bac55dc473ad71e04a74915e97831d929d830a"} Apr 24 21:22:24.487438 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:24.487375 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:24.487933 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:24.487555 2581 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:22:24.487933 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:24.487580 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:22:24.487933 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:24.487611 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst: references non-existent secret key: tls.crt Apr 24 21:22:24.487933 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:24.487681 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates podName:ac5ef66c-66c6-44c0-87f4-b7edbac7dd85 nodeName:}" failed. No retries permitted until 2026-04-24 21:22:26.487659947 +0000 UTC m=+400.096798903 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates") pod "keda-metrics-apiserver-7c9f485588-jvzst" (UID: "ac5ef66c-66c6-44c0-87f4-b7edbac7dd85") : references non-existent secret key: tls.crt Apr 24 21:22:25.101256 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:25.101175 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tmffv" event={"ID":"dea1515b-0167-4e4b-8f70-6abd4be0a28e","Type":"ContainerStarted","Data":"0931e4071c5c75e970777169ea8cc6f101c39d92488bf7dd6c53c1a56e57d341"} Apr 24 21:22:25.101414 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:25.101286 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:22:25.122047 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:25.122001 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-tmffv" podStartSLOduration=0.878302647 podStartE2EDuration="2.121986923s" podCreationTimestamp="2026-04-24 21:22:23 +0000 UTC" firstStartedPulling="2026-04-24 21:22:23.461794486 +0000 UTC m=+397.070933443" lastFinishedPulling="2026-04-24 21:22:24.705478761 +0000 UTC m=+398.314617719" observedRunningTime="2026-04-24 21:22:25.120592226 +0000 UTC m=+398.729731197" watchObservedRunningTime="2026-04-24 21:22:25.121986923 +0000 UTC m=+398.731125902" Apr 24 21:22:26.504472 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:26.504434 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:26.504866 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:26.504568 2581 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:22:26.504866 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:26.504586 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:22:26.504866 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:26.504602 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst: references non-existent secret key: tls.crt Apr 24 21:22:26.504866 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:22:26.504664 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates podName:ac5ef66c-66c6-44c0-87f4-b7edbac7dd85 nodeName:}" failed. No retries permitted until 2026-04-24 21:22:30.504648335 +0000 UTC m=+404.113787293 (durationBeforeRetry 4s). 
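Note: the MountVolume.SetUp failures in the surrounding entries come from the projected "certificates" volume requesting the tls.crt key before the openshift-keda/kedaorg-certs secret carries it; each attempt is requeued with a doubled durationBeforeRetry (500ms, 1s, 2s, 4s) until the key exists and the mount finally succeeds at 21:22:30. The minimal Go sketch below mirrors that retry shape using values taken from these entries; it is an illustration consistent with the log, not the kubelet's actual code.

package main

import (
	"fmt"
	"time"
)

// mountProjectedSecret mimics the check behind "references non-existent secret
// key": the projection only succeeds once the requested key exists in the
// secret's data. Sketch only; not the kubelet's implementation.
func mountProjectedSecret(data map[string][]byte, key string) error {
	if _, ok := data[key]; !ok {
		return fmt.Errorf("references non-existent secret key: %s", key)
	}
	return nil
}

func main() {
	// Snapshots of the kedaorg-certs data as implied by the log: the tls.crt
	// key is absent for the first four attempts and present for the last one.
	attempts := []map[string][]byte{
		{},                         // ~21:22:22, fails
		{},                         // ~21:22:23, fails
		{},                         // ~21:22:24, fails
		{},                         // ~21:22:26, fails
		{"tls.crt": []byte("...")}, // ~21:22:30, succeeds
	}
	backoff := 500 * time.Millisecond // first durationBeforeRetry seen in the log
	for i, data := range attempts {
		if err := mountProjectedSecret(data, "tls.crt"); err != nil {
			fmt.Printf("attempt %d: MountVolume.SetUp failed: %v (retry in %v)\n", i+1, err, backoff)
			backoff *= 2 // delay doubles on each failure: 500ms, 1s, 2s, 4s
			continue
		}
		fmt.Printf("attempt %d: MountVolume.SetUp succeeded\n", i+1)
	}
}

Running it prints four failures with doubling delays and then a success, matching the order of the projected.go / nestedpendingoperations.go and operation_generator.go entries around this point in the log.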
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates") pod "keda-metrics-apiserver-7c9f485588-jvzst" (UID: "ac5ef66c-66c6-44c0-87f4-b7edbac7dd85") : references non-existent secret key: tls.crt Apr 24 21:22:30.536018 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:30.535981 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:30.538424 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:30.538388 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac5ef66c-66c6-44c0-87f4-b7edbac7dd85-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jvzst\" (UID: \"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:30.805461 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:30.805305 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:30.918522 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:30.918492 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst"] Apr 24 21:22:30.922214 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:22:30.922187 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5ef66c_66c6_44c0_87f4_b7edbac7dd85.slice/crio-9ec2630ce12d4f4094e90fd82ed3a6ab97b82cbe7d460deff3921e99c6833b21 WatchSource:0}: Error finding container 9ec2630ce12d4f4094e90fd82ed3a6ab97b82cbe7d460deff3921e99c6833b21: Status 404 returned error can't find the container with id 9ec2630ce12d4f4094e90fd82ed3a6ab97b82cbe7d460deff3921e99c6833b21 Apr 24 21:22:31.117485 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:31.117379 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" event={"ID":"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85","Type":"ContainerStarted","Data":"9ec2630ce12d4f4094e90fd82ed3a6ab97b82cbe7d460deff3921e99c6833b21"} Apr 24 21:22:34.126456 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:34.126413 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" event={"ID":"ac5ef66c-66c6-44c0-87f4-b7edbac7dd85","Type":"ContainerStarted","Data":"245bea2fba9c93f82c7e5893844b5ec9fa8d4de836f98a3d7c13da4511df11cf"} Apr 24 21:22:34.126838 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:34.126524 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:34.143101 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:34.143052 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" podStartSLOduration=9.688587872 podStartE2EDuration="12.143037735s" podCreationTimestamp="2026-04-24 21:22:22 +0000 UTC" firstStartedPulling="2026-04-24 21:22:30.923386241 +0000 UTC m=+404.532525201" lastFinishedPulling="2026-04-24 21:22:33.377836106 +0000 UTC m=+406.986975064" 
observedRunningTime="2026-04-24 21:22:34.141825687 +0000 UTC m=+407.750964666" watchObservedRunningTime="2026-04-24 21:22:34.143037735 +0000 UTC m=+407.752176727" Apr 24 21:22:43.095110 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:43.095079 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-27zdd" Apr 24 21:22:45.134325 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:45.134294 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jvzst" Apr 24 21:22:46.105736 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:22:46.105707 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-tmffv" Apr 24 21:23:32.461949 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.461914 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-tgpzm"] Apr 24 21:23:32.468275 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.468251 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:23:32.471548 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.471525 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:23:32.475574 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.475555 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-tgpzm"] Apr 24 21:23:32.475929 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.475916 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-d7l67\"" Apr 24 21:23:32.476174 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.476161 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:23:32.476255 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.476237 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:23:32.574976 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.574933 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7r64\" (UniqueName: \"kubernetes.io/projected/b8382f57-de8b-494c-9d57-d6a7172d1123-kube-api-access-h7r64\") pod \"seaweedfs-86cc847c5c-tgpzm\" (UID: \"b8382f57-de8b-494c-9d57-d6a7172d1123\") " pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:23:32.575181 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.574989 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b8382f57-de8b-494c-9d57-d6a7172d1123-data\") pod \"seaweedfs-86cc847c5c-tgpzm\" (UID: \"b8382f57-de8b-494c-9d57-d6a7172d1123\") " pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:23:32.676239 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.676184 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7r64\" (UniqueName: \"kubernetes.io/projected/b8382f57-de8b-494c-9d57-d6a7172d1123-kube-api-access-h7r64\") pod \"seaweedfs-86cc847c5c-tgpzm\" (UID: \"b8382f57-de8b-494c-9d57-d6a7172d1123\") " pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:23:32.676440 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.676347 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b8382f57-de8b-494c-9d57-d6a7172d1123-data\") pod \"seaweedfs-86cc847c5c-tgpzm\" (UID: \"b8382f57-de8b-494c-9d57-d6a7172d1123\") " pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:23:32.676736 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.676713 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b8382f57-de8b-494c-9d57-d6a7172d1123-data\") pod \"seaweedfs-86cc847c5c-tgpzm\" (UID: \"b8382f57-de8b-494c-9d57-d6a7172d1123\") " pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:23:32.685343 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.685320 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7r64\" (UniqueName: \"kubernetes.io/projected/b8382f57-de8b-494c-9d57-d6a7172d1123-kube-api-access-h7r64\") pod \"seaweedfs-86cc847c5c-tgpzm\" (UID: \"b8382f57-de8b-494c-9d57-d6a7172d1123\") " pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:23:32.777477 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.777365 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:23:32.899997 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:32.899961 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-tgpzm"] Apr 24 21:23:32.903858 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:23:32.903825 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8382f57_de8b_494c_9d57_d6a7172d1123.slice/crio-b16a7d5114fb191604ab1a4e6ec32b76b9a2a9ad09cfccbd75b14854c004fc12 WatchSource:0}: Error finding container b16a7d5114fb191604ab1a4e6ec32b76b9a2a9ad09cfccbd75b14854c004fc12: Status 404 returned error can't find the container with id b16a7d5114fb191604ab1a4e6ec32b76b9a2a9ad09cfccbd75b14854c004fc12 Apr 24 21:23:33.290639 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:33.290597 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-tgpzm" event={"ID":"b8382f57-de8b-494c-9d57-d6a7172d1123","Type":"ContainerStarted","Data":"b16a7d5114fb191604ab1a4e6ec32b76b9a2a9ad09cfccbd75b14854c004fc12"} Apr 24 21:23:36.301576 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:36.301486 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-tgpzm" event={"ID":"b8382f57-de8b-494c-9d57-d6a7172d1123","Type":"ContainerStarted","Data":"3f370790eeb9137ef75889f752111535a12634a505acb30e1dc5925bd06154d7"} Apr 24 21:23:36.301576 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:36.301544 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:23:36.320335 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:23:36.320285 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-tgpzm" podStartSLOduration=1.180765684 podStartE2EDuration="4.32027004s" podCreationTimestamp="2026-04-24 21:23:32 +0000 UTC" firstStartedPulling="2026-04-24 21:23:32.905083625 +0000 UTC m=+466.514222582" lastFinishedPulling="2026-04-24 21:23:36.044587968 +0000 UTC m=+469.653726938" observedRunningTime="2026-04-24 21:23:36.319173932 +0000 UTC m=+469.928312913" watchObservedRunningTime="2026-04-24 21:23:36.32027004 +0000 UTC m=+469.929409019" Apr 24 21:23:42.306198 ip-10-0-132-159 kubenswrapper[2581]: I0424 
21:23:42.306165 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-tgpzm" Apr 24 21:24:42.015105 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.015023 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-6jpsh"] Apr 24 21:24:42.017980 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.017963 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:24:42.020335 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.020315 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 21:24:42.020454 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.020361 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-vf24f\"" Apr 24 21:24:42.027250 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.027218 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-6jpsh"] Apr 24 21:24:42.071758 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.071730 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39aea3ba-1dc8-464a-a346-3a8d195c2d2d-tls-certs\") pod \"model-serving-api-86f7b4b499-6jpsh\" (UID: \"39aea3ba-1dc8-464a-a346-3a8d195c2d2d\") " pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:24:42.071909 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.071784 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxh2k\" (UniqueName: \"kubernetes.io/projected/39aea3ba-1dc8-464a-a346-3a8d195c2d2d-kube-api-access-kxh2k\") pod \"model-serving-api-86f7b4b499-6jpsh\" (UID: \"39aea3ba-1dc8-464a-a346-3a8d195c2d2d\") " pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:24:42.172692 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.172653 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxh2k\" (UniqueName: \"kubernetes.io/projected/39aea3ba-1dc8-464a-a346-3a8d195c2d2d-kube-api-access-kxh2k\") pod \"model-serving-api-86f7b4b499-6jpsh\" (UID: \"39aea3ba-1dc8-464a-a346-3a8d195c2d2d\") " pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:24:42.172889 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.172731 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39aea3ba-1dc8-464a-a346-3a8d195c2d2d-tls-certs\") pod \"model-serving-api-86f7b4b499-6jpsh\" (UID: \"39aea3ba-1dc8-464a-a346-3a8d195c2d2d\") " pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:24:42.175141 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.175118 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39aea3ba-1dc8-464a-a346-3a8d195c2d2d-tls-certs\") pod \"model-serving-api-86f7b4b499-6jpsh\" (UID: \"39aea3ba-1dc8-464a-a346-3a8d195c2d2d\") " pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:24:42.181680 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.181657 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxh2k\" (UniqueName: \"kubernetes.io/projected/39aea3ba-1dc8-464a-a346-3a8d195c2d2d-kube-api-access-kxh2k\") pod \"model-serving-api-86f7b4b499-6jpsh\" (UID: 
\"39aea3ba-1dc8-464a-a346-3a8d195c2d2d\") " pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:24:42.328065 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.327986 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:24:42.442672 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.442640 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-6jpsh"] Apr 24 21:24:42.445631 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:24:42.445598 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39aea3ba_1dc8_464a_a346_3a8d195c2d2d.slice/crio-6195645c0d41ac33fb99a34ec4e21921c0349f0a5752ac53c30349e36c549b83 WatchSource:0}: Error finding container 6195645c0d41ac33fb99a34ec4e21921c0349f0a5752ac53c30349e36c549b83: Status 404 returned error can't find the container with id 6195645c0d41ac33fb99a34ec4e21921c0349f0a5752ac53c30349e36c549b83 Apr 24 21:24:42.490304 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:42.490270 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-6jpsh" event={"ID":"39aea3ba-1dc8-464a-a346-3a8d195c2d2d","Type":"ContainerStarted","Data":"6195645c0d41ac33fb99a34ec4e21921c0349f0a5752ac53c30349e36c549b83"} Apr 24 21:24:45.501918 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:45.501875 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-6jpsh" event={"ID":"39aea3ba-1dc8-464a-a346-3a8d195c2d2d","Type":"ContainerStarted","Data":"46a698fd542c04608d307fe7855b921c4c53572a2b0387c7e173f448fb038e75"} Apr 24 21:24:45.502298 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:45.501994 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:24:45.524978 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:45.524934 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-6jpsh" podStartSLOduration=2.230169805 podStartE2EDuration="4.524919853s" podCreationTimestamp="2026-04-24 21:24:41 +0000 UTC" firstStartedPulling="2026-04-24 21:24:42.447354081 +0000 UTC m=+536.056493039" lastFinishedPulling="2026-04-24 21:24:44.742104111 +0000 UTC m=+538.351243087" observedRunningTime="2026-04-24 21:24:45.523103365 +0000 UTC m=+539.132242345" watchObservedRunningTime="2026-04-24 21:24:45.524919853 +0000 UTC m=+539.134058831" Apr 24 21:24:56.509251 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:24:56.509223 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-6jpsh" Apr 24 21:25:08.962141 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:08.962103 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs"] Apr 24 21:25:08.965379 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:08.965358 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:08.967606 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:08.967579 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 21:25:08.972731 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:08.972710 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs"] Apr 24 21:25:09.070159 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:09.070129 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-zlwjs\" (UID: \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:09.070159 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:09.070164 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtt2f\" (UniqueName: \"kubernetes.io/projected/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-kube-api-access-dtt2f\") pod \"seaweedfs-tls-custom-ddd4dbfd-zlwjs\" (UID: \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:09.171474 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:09.171427 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-zlwjs\" (UID: \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:09.171474 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:09.171474 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtt2f\" (UniqueName: \"kubernetes.io/projected/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-kube-api-access-dtt2f\") pod \"seaweedfs-tls-custom-ddd4dbfd-zlwjs\" (UID: \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:09.171881 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:09.171861 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-zlwjs\" (UID: \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:09.179830 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:09.179803 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtt2f\" (UniqueName: \"kubernetes.io/projected/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-kube-api-access-dtt2f\") pod \"seaweedfs-tls-custom-ddd4dbfd-zlwjs\" (UID: \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:09.274890 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:09.274793 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:09.390928 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:09.390895 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs"] Apr 24 21:25:09.393959 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:25:09.393929 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6d3c753_47ec_4891_a75c_ecf59c5c1b1d.slice/crio-0f6287ae4a3299414b23b368ef1505433100d3205ebca8879461b4f9029c5608 WatchSource:0}: Error finding container 0f6287ae4a3299414b23b368ef1505433100d3205ebca8879461b4f9029c5608: Status 404 returned error can't find the container with id 0f6287ae4a3299414b23b368ef1505433100d3205ebca8879461b4f9029c5608 Apr 24 21:25:09.565265 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:09.565181 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" event={"ID":"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d","Type":"ContainerStarted","Data":"0f6287ae4a3299414b23b368ef1505433100d3205ebca8879461b4f9029c5608"} Apr 24 21:25:10.569477 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:10.569439 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" event={"ID":"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d","Type":"ContainerStarted","Data":"d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237"} Apr 24 21:25:10.583878 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:10.583817 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" podStartSLOduration=2.304777468 podStartE2EDuration="2.583800157s" podCreationTimestamp="2026-04-24 21:25:08 +0000 UTC" firstStartedPulling="2026-04-24 21:25:09.39532464 +0000 UTC m=+563.004463605" lastFinishedPulling="2026-04-24 21:25:09.674347337 +0000 UTC m=+563.283486294" observedRunningTime="2026-04-24 21:25:10.583019157 +0000 UTC m=+564.192158139" watchObservedRunningTime="2026-04-24 21:25:10.583800157 +0000 UTC m=+564.192939136" Apr 24 21:25:11.506865 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:11.506822 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs"] Apr 24 21:25:12.574698 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:12.574653 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" podUID="a6d3c753-47ec-4891-a75c-ecf59c5c1b1d" containerName="seaweedfs-tls-custom" containerID="cri-o://d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237" gracePeriod=30 Apr 24 21:25:13.809649 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:13.809628 2581 util.go:48] "No ready sandbox for pod can be found. 
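Note: the "Observed pod startup duration" entries in this log report two figures; for every pod here, podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that E2E time minus the image-pull window (firstStartedPulling to lastFinishedPulling), to within clock rounding. The short Go check below recomputes both for the seaweedfs-tls-custom-ddd4dbfd-zlwjs entry above; it is a reading of the logged fields, not the tracker's actual code.

package main

import (
	"fmt"
	"time"
)

// mustParse parses timestamps in the format the kubelet logs them,
// e.g. "2026-04-24 21:25:08 +0000 UTC".
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the seaweedfs-tls-custom-ddd4dbfd-zlwjs entry above.
	created := mustParse("2026-04-24 21:25:08 +0000 UTC")
	firstPull := mustParse("2026-04-24 21:25:09.39532464 +0000 UTC")
	lastPull := mustParse("2026-04-24 21:25:09.674347337 +0000 UTC")
	running := mustParse("2026-04-24 21:25:10.583800157 +0000 UTC")

	e2e := running.Sub(created)          // logged podStartE2EDuration: 2.583800157s
	slo := e2e - lastPull.Sub(firstPull) // logged podStartSLOduration: 2.304777468

	fmt.Println("E2E:", e2e) // 2.583800157s
	fmt.Println("SLO:", slo) // ~2.30477746s (image-pull time excluded)
}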
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:13.909053 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:13.909021 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtt2f\" (UniqueName: \"kubernetes.io/projected/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-kube-api-access-dtt2f\") pod \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\" (UID: \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\") " Apr 24 21:25:13.909203 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:13.909071 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-data\") pod \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\" (UID: \"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d\") " Apr 24 21:25:13.910352 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:13.910326 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-data" (OuterVolumeSpecName: "data") pod "a6d3c753-47ec-4891-a75c-ecf59c5c1b1d" (UID: "a6d3c753-47ec-4891-a75c-ecf59c5c1b1d"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:25:13.911162 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:13.911136 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-kube-api-access-dtt2f" (OuterVolumeSpecName: "kube-api-access-dtt2f") pod "a6d3c753-47ec-4891-a75c-ecf59c5c1b1d" (UID: "a6d3c753-47ec-4891-a75c-ecf59c5c1b1d"). InnerVolumeSpecName "kube-api-access-dtt2f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:25:14.010535 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.010486 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dtt2f\" (UniqueName: \"kubernetes.io/projected/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-kube-api-access-dtt2f\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:25:14.010535 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.010528 2581 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d-data\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:25:14.581458 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.581422 2581 generic.go:358] "Generic (PLEG): container finished" podID="a6d3c753-47ec-4891-a75c-ecf59c5c1b1d" containerID="d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237" exitCode=0 Apr 24 21:25:14.581643 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.581483 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" event={"ID":"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d","Type":"ContainerDied","Data":"d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237"} Apr 24 21:25:14.581643 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.581516 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" event={"ID":"a6d3c753-47ec-4891-a75c-ecf59c5c1b1d","Type":"ContainerDied","Data":"0f6287ae4a3299414b23b368ef1505433100d3205ebca8879461b4f9029c5608"} Apr 24 21:25:14.581643 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.581535 2581 scope.go:117] "RemoveContainer" containerID="d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237" Apr 24 21:25:14.581643 ip-10-0-132-159 kubenswrapper[2581]: I0424 
21:25:14.581535 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs" Apr 24 21:25:14.591210 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.591193 2581 scope.go:117] "RemoveContainer" containerID="d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237" Apr 24 21:25:14.591496 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:25:14.591470 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237\": container with ID starting with d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237 not found: ID does not exist" containerID="d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237" Apr 24 21:25:14.591546 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.591508 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237"} err="failed to get container status \"d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237\": rpc error: code = NotFound desc = could not find container \"d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237\": container with ID starting with d10fc44942aef000351ecce576f275e7b4b7f3975cfd471c77c58fd6f5fbd237 not found: ID does not exist" Apr 24 21:25:14.602683 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.602662 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs"] Apr 24 21:25:14.607626 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.607605 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-zlwjs"] Apr 24 21:25:14.633674 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.633642 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc"] Apr 24 21:25:14.633900 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.633889 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6d3c753-47ec-4891-a75c-ecf59c5c1b1d" containerName="seaweedfs-tls-custom" Apr 24 21:25:14.633946 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.633902 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d3c753-47ec-4891-a75c-ecf59c5c1b1d" containerName="seaweedfs-tls-custom" Apr 24 21:25:14.633978 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.633953 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6d3c753-47ec-4891-a75c-ecf59c5c1b1d" containerName="seaweedfs-tls-custom" Apr 24 21:25:14.638151 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.638132 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.640405 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.640375 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 24 21:25:14.640529 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.640438 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 21:25:14.646177 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.646153 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc"] Apr 24 21:25:14.716603 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.716572 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/2dc277b4-d4f3-4907-a8e7-f21637b0faf4-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-67jrc\" (UID: \"2dc277b4-d4f3-4907-a8e7-f21637b0faf4\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.716603 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.716608 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn2nx\" (UniqueName: \"kubernetes.io/projected/2dc277b4-d4f3-4907-a8e7-f21637b0faf4-kube-api-access-bn2nx\") pod \"seaweedfs-tls-custom-5c88b85bb7-67jrc\" (UID: \"2dc277b4-d4f3-4907-a8e7-f21637b0faf4\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.716826 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.716643 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2dc277b4-d4f3-4907-a8e7-f21637b0faf4-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-67jrc\" (UID: \"2dc277b4-d4f3-4907-a8e7-f21637b0faf4\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.818069 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.818031 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2dc277b4-d4f3-4907-a8e7-f21637b0faf4-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-67jrc\" (UID: \"2dc277b4-d4f3-4907-a8e7-f21637b0faf4\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.818478 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.818121 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/2dc277b4-d4f3-4907-a8e7-f21637b0faf4-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-67jrc\" (UID: \"2dc277b4-d4f3-4907-a8e7-f21637b0faf4\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.818478 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.818153 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn2nx\" (UniqueName: \"kubernetes.io/projected/2dc277b4-d4f3-4907-a8e7-f21637b0faf4-kube-api-access-bn2nx\") pod \"seaweedfs-tls-custom-5c88b85bb7-67jrc\" (UID: \"2dc277b4-d4f3-4907-a8e7-f21637b0faf4\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.818562 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.818494 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2dc277b4-d4f3-4907-a8e7-f21637b0faf4-data\") pod 
\"seaweedfs-tls-custom-5c88b85bb7-67jrc\" (UID: \"2dc277b4-d4f3-4907-a8e7-f21637b0faf4\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.820540 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.820521 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/2dc277b4-d4f3-4907-a8e7-f21637b0faf4-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-67jrc\" (UID: \"2dc277b4-d4f3-4907-a8e7-f21637b0faf4\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.826605 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.826582 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn2nx\" (UniqueName: \"kubernetes.io/projected/2dc277b4-d4f3-4907-a8e7-f21637b0faf4-kube-api-access-bn2nx\") pod \"seaweedfs-tls-custom-5c88b85bb7-67jrc\" (UID: \"2dc277b4-d4f3-4907-a8e7-f21637b0faf4\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.947075 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.947045 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" Apr 24 21:25:14.949462 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:14.949434 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d3c753-47ec-4891-a75c-ecf59c5c1b1d" path="/var/lib/kubelet/pods/a6d3c753-47ec-4891-a75c-ecf59c5c1b1d/volumes" Apr 24 21:25:15.063327 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:15.063291 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc"] Apr 24 21:25:15.066555 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:25:15.066526 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dc277b4_d4f3_4907_a8e7_f21637b0faf4.slice/crio-e95735d1fe4c7f0347d0edc8019c4ed3f3da3903f3b00f83dc6bb316618abc73 WatchSource:0}: Error finding container e95735d1fe4c7f0347d0edc8019c4ed3f3da3903f3b00f83dc6bb316618abc73: Status 404 returned error can't find the container with id e95735d1fe4c7f0347d0edc8019c4ed3f3da3903f3b00f83dc6bb316618abc73 Apr 24 21:25:15.585927 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:15.585837 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" event={"ID":"2dc277b4-d4f3-4907-a8e7-f21637b0faf4","Type":"ContainerStarted","Data":"cb93a071fa1c5a29073c44e1ed595915d929efc94010e0679df910939fe694f6"} Apr 24 21:25:15.585927 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:15.585869 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" event={"ID":"2dc277b4-d4f3-4907-a8e7-f21637b0faf4","Type":"ContainerStarted","Data":"e95735d1fe4c7f0347d0edc8019c4ed3f3da3903f3b00f83dc6bb316618abc73"} Apr 24 21:25:15.603918 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:15.603874 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-67jrc" podStartSLOduration=1.350453932 podStartE2EDuration="1.60386009s" podCreationTimestamp="2026-04-24 21:25:14 +0000 UTC" firstStartedPulling="2026-04-24 21:25:15.067819023 +0000 UTC m=+568.676957979" lastFinishedPulling="2026-04-24 21:25:15.32122517 +0000 UTC m=+568.930364137" observedRunningTime="2026-04-24 21:25:15.601693809 +0000 UTC m=+569.210832788" watchObservedRunningTime="2026-04-24 21:25:15.60386009 +0000 UTC m=+569.212999068" Apr 24 
21:25:23.917933 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:23.917898 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv"] Apr 24 21:25:23.922034 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:23.922014 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:23.924174 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:23.924152 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 24 21:25:23.924418 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:23.924376 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 21:25:23.929844 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:23.929821 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv"] Apr 24 21:25:23.987775 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:23.987739 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:23.987943 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:23.987792 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvxft\" (UniqueName: \"kubernetes.io/projected/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-kube-api-access-fvxft\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:23.987943 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:23.987842 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-data\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:24.088660 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:24.088575 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-data\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:24.088660 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:24.088638 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:24.088842 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:24.088687 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvxft\" (UniqueName: \"kubernetes.io/projected/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-kube-api-access-fvxft\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") 
" pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:24.088842 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:25:24.088789 2581 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 24 21:25:24.088842 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:25:24.088808 2581 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv: secret "seaweedfs-tls-serving" not found Apr 24 21:25:24.088947 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:25:24.088870 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-seaweedfs-tls-serving podName:f3d898ad-62e4-4f11-a1a6-a8763745ea7f nodeName:}" failed. No retries permitted until 2026-04-24 21:25:24.588853333 +0000 UTC m=+578.197992291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-qgdgv" (UID: "f3d898ad-62e4-4f11-a1a6-a8763745ea7f") : secret "seaweedfs-tls-serving" not found Apr 24 21:25:24.088996 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:24.088978 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-data\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:24.098640 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:24.098613 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvxft\" (UniqueName: \"kubernetes.io/projected/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-kube-api-access-fvxft\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:24.593691 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:24.593642 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:24.596015 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:24.595985 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f3d898ad-62e4-4f11-a1a6-a8763745ea7f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-qgdgv\" (UID: \"f3d898ad-62e4-4f11-a1a6-a8763745ea7f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:24.831365 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:24.831329 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" Apr 24 21:25:24.949656 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:24.949633 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv"] Apr 24 21:25:24.951133 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:25:24.951107 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3d898ad_62e4_4f11_a1a6_a8763745ea7f.slice/crio-71b940295228a122836703f271475f01e33c98989ed16bfb41d58abdc89e1ba0 WatchSource:0}: Error finding container 71b940295228a122836703f271475f01e33c98989ed16bfb41d58abdc89e1ba0: Status 404 returned error can't find the container with id 71b940295228a122836703f271475f01e33c98989ed16bfb41d58abdc89e1ba0 Apr 24 21:25:25.618143 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:25.618055 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" event={"ID":"f3d898ad-62e4-4f11-a1a6-a8763745ea7f","Type":"ContainerStarted","Data":"c32add164d3dc1f3dbc62c2ead619d4bf071c37127f7f509efd0fc64b16a82e9"} Apr 24 21:25:25.618143 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:25.618098 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" event={"ID":"f3d898ad-62e4-4f11-a1a6-a8763745ea7f","Type":"ContainerStarted","Data":"71b940295228a122836703f271475f01e33c98989ed16bfb41d58abdc89e1ba0"} Apr 24 21:25:25.649461 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:25.649408 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-qgdgv" podStartSLOduration=2.386320786 podStartE2EDuration="2.649375907s" podCreationTimestamp="2026-04-24 21:25:23 +0000 UTC" firstStartedPulling="2026-04-24 21:25:24.952233406 +0000 UTC m=+578.561372364" lastFinishedPulling="2026-04-24 21:25:25.215288529 +0000 UTC m=+578.824427485" observedRunningTime="2026-04-24 21:25:25.648194462 +0000 UTC m=+579.257333454" watchObservedRunningTime="2026-04-24 21:25:25.649375907 +0000 UTC m=+579.258514886" Apr 24 21:25:46.889619 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:46.889593 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:25:46.889933 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:25:46.889717 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:30:46.906830 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:30:46.906799 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:30:46.907984 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:30:46.907960 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:35:46.924428 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:35:46.924382 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:35:46.926198 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:35:46.926178 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:40:46.947717 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:40:46.947686 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:40:46.949629 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:40:46.949607 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:43:35.594778 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.594728 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66"] Apr 24 21:43:35.598189 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.598167 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.600449 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.600427 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:43:35.600564 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.600459 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 21:43:35.600564 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.600433 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:43:35.601068 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.601035 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:43:35.601068 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.601047 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 24 21:43:35.608334 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.608315 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66"] Apr 24 21:43:35.677493 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.677460 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.677665 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.677515 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p56d2\" (UniqueName: \"kubernetes.io/projected/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kube-api-access-p56d2\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.677665 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.677597 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.677665 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.677635 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.778786 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.778742 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.778965 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.778797 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.778965 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.778832 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p56d2\" (UniqueName: \"kubernetes.io/projected/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kube-api-access-p56d2\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.778965 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.778865 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.779080 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:43:35.778976 2581 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-runtime-predictor-serving-cert: secret "isvc-pmml-runtime-predictor-serving-cert" not found Apr 24 21:43:35.779080 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:43:35.779039 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-proxy-tls podName:f17ab9b5-1afd-4e58-acd0-f4b7342419e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:43:36.279024282 +0000 UTC m=+1669.888163244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-proxy-tls") pod "isvc-pmml-runtime-predictor-67bc544947-bck66" (UID: "f17ab9b5-1afd-4e58-acd0-f4b7342419e7") : secret "isvc-pmml-runtime-predictor-serving-cert" not found Apr 24 21:43:35.779293 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.779269 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.779538 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.779522 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:35.787215 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:35.787192 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p56d2\" (UniqueName: \"kubernetes.io/projected/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kube-api-access-p56d2\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:36.284143 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:36.284112 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:36.286422 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:36.286385 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-bck66\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:36.508924 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:36.508884 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:36.631443 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:36.631388 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66"] Apr 24 21:43:36.634622 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:43:36.634583 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf17ab9b5_1afd_4e58_acd0_f4b7342419e7.slice/crio-7f36640a1b1e4ff7b51b1d01d8015038c6355c44d9e7670cd227fd1276c06970 WatchSource:0}: Error finding container 7f36640a1b1e4ff7b51b1d01d8015038c6355c44d9e7670cd227fd1276c06970: Status 404 returned error can't find the container with id 7f36640a1b1e4ff7b51b1d01d8015038c6355c44d9e7670cd227fd1276c06970 Apr 24 21:43:36.636493 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:36.636475 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:43:37.600683 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:37.600642 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" event={"ID":"f17ab9b5-1afd-4e58-acd0-f4b7342419e7","Type":"ContainerStarted","Data":"7f36640a1b1e4ff7b51b1d01d8015038c6355c44d9e7670cd227fd1276c06970"} Apr 24 21:43:40.610926 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:40.610887 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" event={"ID":"f17ab9b5-1afd-4e58-acd0-f4b7342419e7","Type":"ContainerStarted","Data":"4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c"} Apr 24 21:43:44.623097 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:44.623064 2581 generic.go:358] "Generic (PLEG): container finished" podID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerID="4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c" exitCode=0 Apr 24 21:43:44.623488 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:44.623131 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" event={"ID":"f17ab9b5-1afd-4e58-acd0-f4b7342419e7","Type":"ContainerDied","Data":"4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c"} Apr 24 21:43:51.648883 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:51.648848 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" event={"ID":"f17ab9b5-1afd-4e58-acd0-f4b7342419e7","Type":"ContainerStarted","Data":"6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a"} Apr 24 21:43:54.660951 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:54.660913 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" event={"ID":"f17ab9b5-1afd-4e58-acd0-f4b7342419e7","Type":"ContainerStarted","Data":"aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d"} Apr 24 21:43:54.661330 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:54.661088 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:54.679592 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:54.679530 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podStartSLOduration=2.574816803 podStartE2EDuration="19.67951641s" podCreationTimestamp="2026-04-24 21:43:35 +0000 UTC" firstStartedPulling="2026-04-24 21:43:36.636666087 +0000 UTC m=+1670.245805045" lastFinishedPulling="2026-04-24 21:43:53.741365679 +0000 UTC m=+1687.350504652" observedRunningTime="2026-04-24 21:43:54.678283283 +0000 UTC m=+1688.287422261" watchObservedRunningTime="2026-04-24 21:43:54.67951641 +0000 UTC m=+1688.288655390" Apr 24 21:43:55.663981 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:55.663955 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:43:55.665187 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:55.665158 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:43:56.666334 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:43:56.666287 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:44:01.670747 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:44:01.670716 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:44:01.671197 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:44:01.671169 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:44:11.671785 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:44:11.671699 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:44:21.671765 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:44:21.671721 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:44:31.671934 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:44:31.671896 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:44:41.671330 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:44:41.671291 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.21:8080: connect: connection refused" Apr 24 21:44:51.671437 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:44:51.671378 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:45:01.671999 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:01.671953 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:45:11.672290 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:11.672256 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:45:17.001319 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.001286 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66"] Apr 24 21:45:17.001816 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.001689 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" containerID="cri-o://6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a" gracePeriod=30 Apr 24 21:45:17.001816 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.001786 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kube-rbac-proxy" containerID="cri-o://aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d" gracePeriod=30 Apr 24 21:45:17.112667 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.112631 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs"] Apr 24 21:45:17.116008 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.115985 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.118192 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.118166 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 24 21:45:17.118296 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.118278 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 21:45:17.125852 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.125829 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs"] Apr 24 21:45:17.281319 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.281242 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb399466-61d7-4dbe-a5af-e6e773d8a378-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.281319 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.281282 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdplv\" (UniqueName: \"kubernetes.io/projected/fb399466-61d7-4dbe-a5af-e6e773d8a378-kube-api-access-hdplv\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.281514 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.281327 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb399466-61d7-4dbe-a5af-e6e773d8a378-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.281514 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.281357 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb399466-61d7-4dbe-a5af-e6e773d8a378-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.382377 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.382332 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb399466-61d7-4dbe-a5af-e6e773d8a378-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.382377 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.382380 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdplv\" (UniqueName: 
\"kubernetes.io/projected/fb399466-61d7-4dbe-a5af-e6e773d8a378-kube-api-access-hdplv\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.382654 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.382437 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb399466-61d7-4dbe-a5af-e6e773d8a378-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.382654 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.382478 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb399466-61d7-4dbe-a5af-e6e773d8a378-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.382927 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.382906 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb399466-61d7-4dbe-a5af-e6e773d8a378-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.382998 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.382972 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb399466-61d7-4dbe-a5af-e6e773d8a378-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.384811 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.384792 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb399466-61d7-4dbe-a5af-e6e773d8a378-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.390152 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.390130 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdplv\" (UniqueName: \"kubernetes.io/projected/fb399466-61d7-4dbe-a5af-e6e773d8a378-kube-api-access-hdplv\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.426041 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.426009 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:17.544575 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.544495 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs"] Apr 24 21:45:17.547432 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:45:17.547385 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb399466_61d7_4dbe_a5af_e6e773d8a378.slice/crio-d4f8e79f44810637619b9be016733644f250f7184bcf520653e9c7370c2f4f0c WatchSource:0}: Error finding container d4f8e79f44810637619b9be016733644f250f7184bcf520653e9c7370c2f4f0c: Status 404 returned error can't find the container with id d4f8e79f44810637619b9be016733644f250f7184bcf520653e9c7370c2f4f0c Apr 24 21:45:17.894704 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.894660 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" event={"ID":"fb399466-61d7-4dbe-a5af-e6e773d8a378","Type":"ContainerStarted","Data":"87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b"} Apr 24 21:45:17.894704 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.894708 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" event={"ID":"fb399466-61d7-4dbe-a5af-e6e773d8a378","Type":"ContainerStarted","Data":"d4f8e79f44810637619b9be016733644f250f7184bcf520653e9c7370c2f4f0c"} Apr 24 21:45:17.896475 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.896449 2581 generic.go:358] "Generic (PLEG): container finished" podID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerID="aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d" exitCode=2 Apr 24 21:45:17.896593 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:17.896488 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" event={"ID":"f17ab9b5-1afd-4e58-acd0-f4b7342419e7","Type":"ContainerDied","Data":"aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d"} Apr 24 21:45:20.650095 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.650068 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:45:20.809730 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.809649 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kserve-provision-location\") pod \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " Apr 24 21:45:20.809730 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.809691 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " Apr 24 21:45:20.809730 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.809717 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-proxy-tls\") pod \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " Apr 24 21:45:20.810005 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.809750 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p56d2\" (UniqueName: \"kubernetes.io/projected/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kube-api-access-p56d2\") pod \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\" (UID: \"f17ab9b5-1afd-4e58-acd0-f4b7342419e7\") " Apr 24 21:45:20.810098 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.810065 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f17ab9b5-1afd-4e58-acd0-f4b7342419e7" (UID: "f17ab9b5-1afd-4e58-acd0-f4b7342419e7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:20.810218 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.810074 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "f17ab9b5-1afd-4e58-acd0-f4b7342419e7" (UID: "f17ab9b5-1afd-4e58-acd0-f4b7342419e7"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:20.812535 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.812506 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f17ab9b5-1afd-4e58-acd0-f4b7342419e7" (UID: "f17ab9b5-1afd-4e58-acd0-f4b7342419e7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:20.812535 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.812517 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kube-api-access-p56d2" (OuterVolumeSpecName: "kube-api-access-p56d2") pod "f17ab9b5-1afd-4e58-acd0-f4b7342419e7" (UID: "f17ab9b5-1afd-4e58-acd0-f4b7342419e7"). 
InnerVolumeSpecName "kube-api-access-p56d2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:20.907986 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.907950 2581 generic.go:358] "Generic (PLEG): container finished" podID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerID="6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a" exitCode=0 Apr 24 21:45:20.908141 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.908007 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" event={"ID":"f17ab9b5-1afd-4e58-acd0-f4b7342419e7","Type":"ContainerDied","Data":"6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a"} Apr 24 21:45:20.908141 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.908038 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" Apr 24 21:45:20.908141 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.908051 2581 scope.go:117] "RemoveContainer" containerID="aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d" Apr 24 21:45:20.908288 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.908039 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66" event={"ID":"f17ab9b5-1afd-4e58-acd0-f4b7342419e7","Type":"ContainerDied","Data":"7f36640a1b1e4ff7b51b1d01d8015038c6355c44d9e7670cd227fd1276c06970"} Apr 24 21:45:20.910520 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.910496 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:45:20.910595 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.910528 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:45:20.910595 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.910543 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:45:20.910595 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.910558 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p56d2\" (UniqueName: \"kubernetes.io/projected/f17ab9b5-1afd-4e58-acd0-f4b7342419e7-kube-api-access-p56d2\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:45:20.915992 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.915975 2581 scope.go:117] "RemoveContainer" containerID="6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a" Apr 24 21:45:20.923177 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.923159 2581 scope.go:117] "RemoveContainer" containerID="4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c" Apr 24 21:45:20.928186 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.928167 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66"] Apr 24 21:45:20.930344 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.930330 2581 
scope.go:117] "RemoveContainer" containerID="aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d" Apr 24 21:45:20.930830 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:45:20.930809 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d\": container with ID starting with aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d not found: ID does not exist" containerID="aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d" Apr 24 21:45:20.930930 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.930838 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d"} err="failed to get container status \"aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d\": rpc error: code = NotFound desc = could not find container \"aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d\": container with ID starting with aec88f88439262115ad2944345610b802899d5d5807aea8028ca1c26e8776a9d not found: ID does not exist" Apr 24 21:45:20.930930 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.930856 2581 scope.go:117] "RemoveContainer" containerID="6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a" Apr 24 21:45:20.931111 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:45:20.931089 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a\": container with ID starting with 6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a not found: ID does not exist" containerID="6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a" Apr 24 21:45:20.931175 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.931118 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a"} err="failed to get container status \"6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a\": rpc error: code = NotFound desc = could not find container \"6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a\": container with ID starting with 6372f981ecfc2a23e0b8c194ac58ace10f332080e03936a029fcfedc111ae82a not found: ID does not exist" Apr 24 21:45:20.931175 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.931135 2581 scope.go:117] "RemoveContainer" containerID="4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c" Apr 24 21:45:20.931336 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:45:20.931317 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c\": container with ID starting with 4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c not found: ID does not exist" containerID="4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c" Apr 24 21:45:20.931511 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.931342 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c"} err="failed to get container status \"4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c\": rpc error: code 
= NotFound desc = could not find container \"4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c\": container with ID starting with 4af0361ffd1760d9bc3df1cedb61dd31b87e076e3b65f97c6ae76a0d87a9026c not found: ID does not exist" Apr 24 21:45:20.932166 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.932145 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-bck66"] Apr 24 21:45:20.949238 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:20.949216 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" path="/var/lib/kubelet/pods/f17ab9b5-1afd-4e58-acd0-f4b7342419e7/volumes" Apr 24 21:45:21.913045 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:21.913006 2581 generic.go:358] "Generic (PLEG): container finished" podID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerID="87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b" exitCode=0 Apr 24 21:45:21.913488 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:21.913066 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" event={"ID":"fb399466-61d7-4dbe-a5af-e6e773d8a378","Type":"ContainerDied","Data":"87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b"} Apr 24 21:45:22.917827 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:22.917790 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" event={"ID":"fb399466-61d7-4dbe-a5af-e6e773d8a378","Type":"ContainerStarted","Data":"9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827"} Apr 24 21:45:22.918277 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:22.917837 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" event={"ID":"fb399466-61d7-4dbe-a5af-e6e773d8a378","Type":"ContainerStarted","Data":"2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df"} Apr 24 21:45:22.918277 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:22.918020 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:22.940304 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:22.940255 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podStartSLOduration=5.940241208 podStartE2EDuration="5.940241208s" podCreationTimestamp="2026-04-24 21:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:22.938295992 +0000 UTC m=+1776.547434971" watchObservedRunningTime="2026-04-24 21:45:22.940241208 +0000 UTC m=+1776.549380186" Apr 24 21:45:23.921264 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:23.921233 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:23.922364 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:23.922341 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:45:24.924667 
ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:24.924627 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:45:29.928463 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:29.928429 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:45:29.929075 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:29.929045 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:45:39.929677 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:39.929636 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:45:46.965634 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:46.965605 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:45:46.968092 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:46.968069 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:45:49.928972 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:49.928934 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:45:59.929358 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:45:59.929318 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:46:09.929740 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:09.929701 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:46:19.928985 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:19.928943 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:46:29.929008 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:29.928958 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" 
podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:46:39.929197 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:39.929157 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:46:49.929561 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:49.929528 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:46:58.344608 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:58.344572 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs"] Apr 24 21:46:58.345079 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:58.344903 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" containerID="cri-o://2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df" gracePeriod=30 Apr 24 21:46:58.345079 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:58.344923 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kube-rbac-proxy" containerID="cri-o://9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827" gracePeriod=30 Apr 24 21:46:59.202681 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:59.202649 2581 generic.go:358] "Generic (PLEG): container finished" podID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerID="9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827" exitCode=2 Apr 24 21:46:59.202858 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:59.202694 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" event={"ID":"fb399466-61d7-4dbe-a5af-e6e773d8a378","Type":"ContainerDied","Data":"9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827"} Apr 24 21:46:59.925331 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:59.925289 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.22:8643/healthz\": dial tcp 10.133.0.22:8643: connect: connection refused" Apr 24 21:46:59.929553 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:46:59.929518 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:47:01.882453 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:01.882429 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:47:01.996752 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:01.996646 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdplv\" (UniqueName: \"kubernetes.io/projected/fb399466-61d7-4dbe-a5af-e6e773d8a378-kube-api-access-hdplv\") pod \"fb399466-61d7-4dbe-a5af-e6e773d8a378\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " Apr 24 21:47:01.996752 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:01.996725 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb399466-61d7-4dbe-a5af-e6e773d8a378-kserve-provision-location\") pod \"fb399466-61d7-4dbe-a5af-e6e773d8a378\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " Apr 24 21:47:01.996752 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:01.996745 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb399466-61d7-4dbe-a5af-e6e773d8a378-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"fb399466-61d7-4dbe-a5af-e6e773d8a378\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " Apr 24 21:47:01.997001 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:01.996782 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb399466-61d7-4dbe-a5af-e6e773d8a378-proxy-tls\") pod \"fb399466-61d7-4dbe-a5af-e6e773d8a378\" (UID: \"fb399466-61d7-4dbe-a5af-e6e773d8a378\") " Apr 24 21:47:01.997058 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:01.997030 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb399466-61d7-4dbe-a5af-e6e773d8a378-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb399466-61d7-4dbe-a5af-e6e773d8a378" (UID: "fb399466-61d7-4dbe-a5af-e6e773d8a378"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:01.997121 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:01.997098 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb399466-61d7-4dbe-a5af-e6e773d8a378-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "fb399466-61d7-4dbe-a5af-e6e773d8a378" (UID: "fb399466-61d7-4dbe-a5af-e6e773d8a378"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:47:01.998846 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:01.998819 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb399466-61d7-4dbe-a5af-e6e773d8a378-kube-api-access-hdplv" (OuterVolumeSpecName: "kube-api-access-hdplv") pod "fb399466-61d7-4dbe-a5af-e6e773d8a378" (UID: "fb399466-61d7-4dbe-a5af-e6e773d8a378"). InnerVolumeSpecName "kube-api-access-hdplv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:47:01.998967 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:01.998939 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb399466-61d7-4dbe-a5af-e6e773d8a378-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fb399466-61d7-4dbe-a5af-e6e773d8a378" (UID: "fb399466-61d7-4dbe-a5af-e6e773d8a378"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:47:02.098013 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.097978 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb399466-61d7-4dbe-a5af-e6e773d8a378-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:47:02.098013 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.098007 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb399466-61d7-4dbe-a5af-e6e773d8a378-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:47:02.098013 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.098018 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb399466-61d7-4dbe-a5af-e6e773d8a378-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:47:02.098234 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.098028 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hdplv\" (UniqueName: \"kubernetes.io/projected/fb399466-61d7-4dbe-a5af-e6e773d8a378-kube-api-access-hdplv\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:47:02.212342 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.212304 2581 generic.go:358] "Generic (PLEG): container finished" podID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerID="2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df" exitCode=0 Apr 24 21:47:02.212543 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.212367 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" event={"ID":"fb399466-61d7-4dbe-a5af-e6e773d8a378","Type":"ContainerDied","Data":"2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df"} Apr 24 21:47:02.212543 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.212409 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" Apr 24 21:47:02.212543 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.212424 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs" event={"ID":"fb399466-61d7-4dbe-a5af-e6e773d8a378","Type":"ContainerDied","Data":"d4f8e79f44810637619b9be016733644f250f7184bcf520653e9c7370c2f4f0c"} Apr 24 21:47:02.212543 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.212442 2581 scope.go:117] "RemoveContainer" containerID="9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827" Apr 24 21:47:02.220010 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.219991 2581 scope.go:117] "RemoveContainer" containerID="2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df" Apr 24 21:47:02.226572 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.226555 2581 scope.go:117] "RemoveContainer" containerID="87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b" Apr 24 21:47:02.232184 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.232163 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs"] Apr 24 21:47:02.233339 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.233322 2581 scope.go:117] "RemoveContainer" containerID="9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827" Apr 24 21:47:02.233591 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:47:02.233571 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827\": container with ID starting with 9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827 not found: ID does not exist" containerID="9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827" Apr 24 21:47:02.233655 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.233598 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827"} err="failed to get container status \"9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827\": rpc error: code = NotFound desc = could not find container \"9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827\": container with ID starting with 9e00b231825a3db003fa39af30433c9c4e784107a9c3b8aa1981a94c86572827 not found: ID does not exist" Apr 24 21:47:02.233655 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.233615 2581 scope.go:117] "RemoveContainer" containerID="2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df" Apr 24 21:47:02.233846 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:47:02.233829 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df\": container with ID starting with 2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df not found: ID does not exist" containerID="2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df" Apr 24 21:47:02.233887 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.233853 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df"} err="failed to get container status 
\"2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df\": rpc error: code = NotFound desc = could not find container \"2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df\": container with ID starting with 2e3b0b9ed2274b0abdcaeb09e058ecb0181f9365f3097028313940121591a9df not found: ID does not exist" Apr 24 21:47:02.233887 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.233869 2581 scope.go:117] "RemoveContainer" containerID="87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b" Apr 24 21:47:02.234120 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:47:02.234101 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b\": container with ID starting with 87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b not found: ID does not exist" containerID="87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b" Apr 24 21:47:02.234181 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.234127 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b"} err="failed to get container status \"87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b\": rpc error: code = NotFound desc = could not find container \"87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b\": container with ID starting with 87a0f82533f799c66eb0eeaa95e4b0fed16e2c3616d91f1e9705e62b87e3524b not found: ID does not exist" Apr 24 21:47:02.237346 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.237322 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-l9zxs"] Apr 24 21:47:02.949960 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:47:02.949916 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" path="/var/lib/kubelet/pods/fb399466-61d7-4dbe-a5af-e6e773d8a378/volumes" Apr 24 21:48:34.659927 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.659847 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn"] Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660147 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="storage-initializer" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660158 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="storage-initializer" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660169 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kube-rbac-proxy" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660174 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kube-rbac-proxy" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660187 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="storage-initializer" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660193 2581 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="storage-initializer" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660201 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kube-rbac-proxy" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660207 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kube-rbac-proxy" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660214 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660218 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660225 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660231 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660272 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kserve-container" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660278 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kube-rbac-proxy" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660284 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb399466-61d7-4dbe-a5af-e6e773d8a378" containerName="kserve-container" Apr 24 21:48:34.660376 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.660291 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="f17ab9b5-1afd-4e58-acd0-f4b7342419e7" containerName="kube-rbac-proxy" Apr 24 21:48:34.663307 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.663292 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.665664 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.665637 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:48:34.665805 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.665666 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 21:48:34.665805 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.665694 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:48:34.665805 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.665669 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 24 21:48:34.665805 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.665637 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 24 21:48:34.674211 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.674188 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn"] Apr 24 21:48:34.779124 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.779088 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b6306d6-eda6-4642-be95-7522e91da272-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.779290 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.779131 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b6306d6-eda6-4642-be95-7522e91da272-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.779290 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.779161 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b6306d6-eda6-4642-be95-7522e91da272-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.779290 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.779255 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrbfw\" (UniqueName: \"kubernetes.io/projected/1b6306d6-eda6-4642-be95-7522e91da272-kube-api-access-zrbfw\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.880138 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.880050 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b6306d6-eda6-4642-be95-7522e91da272-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.880266 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.880249 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrbfw\" (UniqueName: \"kubernetes.io/projected/1b6306d6-eda6-4642-be95-7522e91da272-kube-api-access-zrbfw\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.880323 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.880283 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b6306d6-eda6-4642-be95-7522e91da272-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.880323 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.880315 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b6306d6-eda6-4642-be95-7522e91da272-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.880458 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:48:34.880445 2581 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-serving-cert: secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 24 21:48:34.880518 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:48:34.880510 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b6306d6-eda6-4642-be95-7522e91da272-proxy-tls podName:1b6306d6-eda6-4642-be95-7522e91da272 nodeName:}" failed. No retries permitted until 2026-04-24 21:48:35.380496535 +0000 UTC m=+1968.989635492 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1b6306d6-eda6-4642-be95-7522e91da272-proxy-tls") pod "isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" (UID: "1b6306d6-eda6-4642-be95-7522e91da272") : secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 24 21:48:34.880750 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.880726 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b6306d6-eda6-4642-be95-7522e91da272-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.880820 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.880753 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b6306d6-eda6-4642-be95-7522e91da272-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:34.888524 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:34.888504 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrbfw\" (UniqueName: \"kubernetes.io/projected/1b6306d6-eda6-4642-be95-7522e91da272-kube-api-access-zrbfw\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:35.384447 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:35.384412 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b6306d6-eda6-4642-be95-7522e91da272-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:35.386817 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:35.386795 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b6306d6-eda6-4642-be95-7522e91da272-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:35.573678 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:35.573640 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:48:35.696407 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:35.696280 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn"] Apr 24 21:48:35.699101 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:48:35.699070 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6306d6_eda6_4642_be95_7522e91da272.slice/crio-2b2ccd91a1b5716fcc5f0cd07ef5db267d4d0544582f455c8bfd515b6ac5ddd5 WatchSource:0}: Error finding container 2b2ccd91a1b5716fcc5f0cd07ef5db267d4d0544582f455c8bfd515b6ac5ddd5: Status 404 returned error can't find the container with id 2b2ccd91a1b5716fcc5f0cd07ef5db267d4d0544582f455c8bfd515b6ac5ddd5 Apr 24 21:48:36.473115 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:36.473078 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" event={"ID":"1b6306d6-eda6-4642-be95-7522e91da272","Type":"ContainerStarted","Data":"5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2"} Apr 24 21:48:36.473115 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:36.473115 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" event={"ID":"1b6306d6-eda6-4642-be95-7522e91da272","Type":"ContainerStarted","Data":"2b2ccd91a1b5716fcc5f0cd07ef5db267d4d0544582f455c8bfd515b6ac5ddd5"} Apr 24 21:48:39.486042 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:39.486006 2581 generic.go:358] "Generic (PLEG): container finished" podID="1b6306d6-eda6-4642-be95-7522e91da272" containerID="5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2" exitCode=0 Apr 24 21:48:39.486485 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:39.486077 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" event={"ID":"1b6306d6-eda6-4642-be95-7522e91da272","Type":"ContainerDied","Data":"5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2"} Apr 24 21:48:39.487267 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:48:39.487253 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:49:01.565359 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:01.565322 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" event={"ID":"1b6306d6-eda6-4642-be95-7522e91da272","Type":"ContainerStarted","Data":"3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f"} Apr 24 21:49:01.565359 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:01.565360 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" event={"ID":"1b6306d6-eda6-4642-be95-7522e91da272","Type":"ContainerStarted","Data":"fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2"} Apr 24 21:49:01.565859 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:01.565675 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:49:01.565859 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:01.565824 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:49:01.567109 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:01.567080 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:49:01.582101 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:01.582060 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podStartSLOduration=6.536604704 podStartE2EDuration="27.582034387s" podCreationTimestamp="2026-04-24 21:48:34 +0000 UTC" firstStartedPulling="2026-04-24 21:48:39.487368851 +0000 UTC m=+1973.096507809" lastFinishedPulling="2026-04-24 21:49:00.532798524 +0000 UTC m=+1994.141937492" observedRunningTime="2026-04-24 21:49:01.581332261 +0000 UTC m=+1995.190471239" watchObservedRunningTime="2026-04-24 21:49:01.582034387 +0000 UTC m=+1995.191173347" Apr 24 21:49:02.569413 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:02.569359 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:49:07.574293 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:07.574261 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:49:07.574876 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:07.574848 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:49:17.575795 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:17.575753 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:49:27.575580 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:27.575531 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:49:37.575371 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:37.575324 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:49:47.575187 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:47.575147 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:49:57.575646 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:49:57.575601 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:50:07.575802 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:07.575716 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:50:17.576054 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:17.576020 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:50:24.782223 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:24.782190 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn"] Apr 24 21:50:24.782710 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:24.782615 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" containerID="cri-o://fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2" gracePeriod=30 Apr 24 21:50:24.782710 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:24.782633 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kube-rbac-proxy" containerID="cri-o://3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f" gracePeriod=30 Apr 24 21:50:25.804941 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:25.804906 2581 generic.go:358] "Generic (PLEG): container finished" podID="1b6306d6-eda6-4642-be95-7522e91da272" containerID="3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f" exitCode=2 Apr 24 21:50:25.805307 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:25.804973 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" event={"ID":"1b6306d6-eda6-4642-be95-7522e91da272","Type":"ContainerDied","Data":"3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f"} Apr 24 21:50:27.570561 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:27.570518 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.23:8643/healthz\": dial tcp 10.133.0.23:8643: connect: connection refused" Apr 24 21:50:27.574856 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:27.574826 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 
21:50:29.525613 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.525590 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:50:29.693388 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.693339 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b6306d6-eda6-4642-be95-7522e91da272-proxy-tls\") pod \"1b6306d6-eda6-4642-be95-7522e91da272\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " Apr 24 21:50:29.693612 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.693437 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b6306d6-eda6-4642-be95-7522e91da272-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"1b6306d6-eda6-4642-be95-7522e91da272\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " Apr 24 21:50:29.693612 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.693464 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrbfw\" (UniqueName: \"kubernetes.io/projected/1b6306d6-eda6-4642-be95-7522e91da272-kube-api-access-zrbfw\") pod \"1b6306d6-eda6-4642-be95-7522e91da272\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " Apr 24 21:50:29.693612 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.693495 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b6306d6-eda6-4642-be95-7522e91da272-kserve-provision-location\") pod \"1b6306d6-eda6-4642-be95-7522e91da272\" (UID: \"1b6306d6-eda6-4642-be95-7522e91da272\") " Apr 24 21:50:29.693825 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.693797 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b6306d6-eda6-4642-be95-7522e91da272-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "1b6306d6-eda6-4642-be95-7522e91da272" (UID: "1b6306d6-eda6-4642-be95-7522e91da272"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:29.693890 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.693842 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b6306d6-eda6-4642-be95-7522e91da272-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1b6306d6-eda6-4642-be95-7522e91da272" (UID: "1b6306d6-eda6-4642-be95-7522e91da272"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:29.695537 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.695515 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b6306d6-eda6-4642-be95-7522e91da272-kube-api-access-zrbfw" (OuterVolumeSpecName: "kube-api-access-zrbfw") pod "1b6306d6-eda6-4642-be95-7522e91da272" (UID: "1b6306d6-eda6-4642-be95-7522e91da272"). InnerVolumeSpecName "kube-api-access-zrbfw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:29.695604 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.695515 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6306d6-eda6-4642-be95-7522e91da272-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1b6306d6-eda6-4642-be95-7522e91da272" (UID: "1b6306d6-eda6-4642-be95-7522e91da272"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:29.794155 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.794119 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b6306d6-eda6-4642-be95-7522e91da272-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:50:29.794155 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.794147 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b6306d6-eda6-4642-be95-7522e91da272-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:50:29.794155 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.794157 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b6306d6-eda6-4642-be95-7522e91da272-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:50:29.794155 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.794167 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrbfw\" (UniqueName: \"kubernetes.io/projected/1b6306d6-eda6-4642-be95-7522e91da272-kube-api-access-zrbfw\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:50:29.820052 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.820021 2581 generic.go:358] "Generic (PLEG): container finished" podID="1b6306d6-eda6-4642-be95-7522e91da272" containerID="fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2" exitCode=0 Apr 24 21:50:29.820180 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.820063 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" event={"ID":"1b6306d6-eda6-4642-be95-7522e91da272","Type":"ContainerDied","Data":"fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2"} Apr 24 21:50:29.820180 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.820092 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" event={"ID":"1b6306d6-eda6-4642-be95-7522e91da272","Type":"ContainerDied","Data":"2b2ccd91a1b5716fcc5f0cd07ef5db267d4d0544582f455c8bfd515b6ac5ddd5"} Apr 24 21:50:29.820180 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.820096 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn" Apr 24 21:50:29.820180 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.820111 2581 scope.go:117] "RemoveContainer" containerID="3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f" Apr 24 21:50:29.828044 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.827836 2581 scope.go:117] "RemoveContainer" containerID="fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2" Apr 24 21:50:29.834483 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.834467 2581 scope.go:117] "RemoveContainer" containerID="5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2" Apr 24 21:50:29.841570 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.841549 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn"] Apr 24 21:50:29.841734 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.841721 2581 scope.go:117] "RemoveContainer" containerID="3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f" Apr 24 21:50:29.841957 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:50:29.841939 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f\": container with ID starting with 3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f not found: ID does not exist" containerID="3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f" Apr 24 21:50:29.841994 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.841966 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f"} err="failed to get container status \"3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f\": rpc error: code = NotFound desc = could not find container \"3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f\": container with ID starting with 3818ad83129ea0b15e27edf89979219f3a6ee47eb3d44d96609fb5a810a8059f not found: ID does not exist" Apr 24 21:50:29.841994 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.841983 2581 scope.go:117] "RemoveContainer" containerID="fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2" Apr 24 21:50:29.842189 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:50:29.842172 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2\": container with ID starting with fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2 not found: ID does not exist" containerID="fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2" Apr 24 21:50:29.842246 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.842193 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2"} err="failed to get container status \"fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2\": rpc error: code = NotFound desc = could not find container \"fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2\": container with ID starting with fc346a4a60a53824b0faac347ad760ae087a58f82c19aca3784bb8a36c0ec2c2 not found: ID does not exist" Apr 24 21:50:29.842246 ip-10-0-132-159 kubenswrapper[2581]: 
I0424 21:50:29.842215 2581 scope.go:117] "RemoveContainer" containerID="5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2" Apr 24 21:50:29.842455 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:50:29.842427 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2\": container with ID starting with 5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2 not found: ID does not exist" containerID="5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2" Apr 24 21:50:29.842561 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.842460 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2"} err="failed to get container status \"5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2\": rpc error: code = NotFound desc = could not find container \"5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2\": container with ID starting with 5ef175ea8d06f10f1a3004a5062162b1a66cdc4aec5ec65cfd42eedddad956b2 not found: ID does not exist" Apr 24 21:50:29.844228 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:29.844207 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-x5cmn"] Apr 24 21:50:30.950342 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:30.950310 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b6306d6-eda6-4642-be95-7522e91da272" path="/var/lib/kubelet/pods/1b6306d6-eda6-4642-be95-7522e91da272/volumes" Apr 24 21:50:46.984006 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:46.983979 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:50:46.986651 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:50:46.986632 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:52:05.091900 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.091865 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k"] Apr 24 21:52:05.092413 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.092135 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="storage-initializer" Apr 24 21:52:05.092413 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.092146 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="storage-initializer" Apr 24 21:52:05.092413 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.092154 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" Apr 24 21:52:05.092413 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.092160 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" Apr 24 21:52:05.092413 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.092168 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b6306d6-eda6-4642-be95-7522e91da272" 
containerName="kube-rbac-proxy" Apr 24 21:52:05.092413 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.092174 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kube-rbac-proxy" Apr 24 21:52:05.092413 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.092219 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kube-rbac-proxy" Apr 24 21:52:05.092413 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.092226 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b6306d6-eda6-4642-be95-7522e91da272" containerName="kserve-container" Apr 24 21:52:05.095125 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.095110 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.097053 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.097032 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 24 21:52:05.097053 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.097049 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 24 21:52:05.097307 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.097291 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:52:05.097426 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.097383 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:52:05.097558 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.097502 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 21:52:05.104691 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.104670 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k"] Apr 24 21:52:05.198218 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.198179 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s5j\" (UniqueName: \"kubernetes.io/projected/6249ac83-c5dd-4835-bff9-49a2b34befeb-kube-api-access-h8s5j\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.198218 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.198217 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6249ac83-c5dd-4835-bff9-49a2b34befeb-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.198494 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.198236 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/6249ac83-c5dd-4835-bff9-49a2b34befeb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.198494 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.198271 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6249ac83-c5dd-4835-bff9-49a2b34befeb-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.299270 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.299241 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6249ac83-c5dd-4835-bff9-49a2b34befeb-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.299464 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.299331 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s5j\" (UniqueName: \"kubernetes.io/projected/6249ac83-c5dd-4835-bff9-49a2b34befeb-kube-api-access-h8s5j\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.299464 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.299354 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6249ac83-c5dd-4835-bff9-49a2b34befeb-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.299464 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.299371 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6249ac83-c5dd-4835-bff9-49a2b34befeb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.299782 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.299746 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6249ac83-c5dd-4835-bff9-49a2b34befeb-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.300088 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.300071 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/6249ac83-c5dd-4835-bff9-49a2b34befeb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.301878 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.301851 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6249ac83-c5dd-4835-bff9-49a2b34befeb-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.306975 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.306950 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s5j\" (UniqueName: \"kubernetes.io/projected/6249ac83-c5dd-4835-bff9-49a2b34befeb-kube-api-access-h8s5j\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.405838 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.405809 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:05.525929 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:05.525734 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k"] Apr 24 21:52:05.528697 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:52:05.528670 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6249ac83_c5dd_4835_bff9_49a2b34befeb.slice/crio-05e8e3fba59172636bf713169a495d8f0bfd7d514717fb4db11e5afb609be898 WatchSource:0}: Error finding container 05e8e3fba59172636bf713169a495d8f0bfd7d514717fb4db11e5afb609be898: Status 404 returned error can't find the container with id 05e8e3fba59172636bf713169a495d8f0bfd7d514717fb4db11e5afb609be898 Apr 24 21:52:06.076604 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:06.076563 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" event={"ID":"6249ac83-c5dd-4835-bff9-49a2b34befeb","Type":"ContainerStarted","Data":"5f03666c1fa6183c6e802725bb37421896d7901513a306773da94ecefaa7d322"} Apr 24 21:52:06.076604 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:06.076600 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" event={"ID":"6249ac83-c5dd-4835-bff9-49a2b34befeb","Type":"ContainerStarted","Data":"05e8e3fba59172636bf713169a495d8f0bfd7d514717fb4db11e5afb609be898"} Apr 24 21:52:10.089096 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:10.089063 2581 generic.go:358] "Generic (PLEG): container finished" podID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerID="5f03666c1fa6183c6e802725bb37421896d7901513a306773da94ecefaa7d322" exitCode=0 Apr 24 21:52:10.089518 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:10.089113 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" 
event={"ID":"6249ac83-c5dd-4835-bff9-49a2b34befeb","Type":"ContainerDied","Data":"5f03666c1fa6183c6e802725bb37421896d7901513a306773da94ecefaa7d322"} Apr 24 21:52:11.093188 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:11.093148 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" event={"ID":"6249ac83-c5dd-4835-bff9-49a2b34befeb","Type":"ContainerStarted","Data":"bb7de130379f55fead8a3d0f3df6e0e95c9bad6155a7ad1b2f03b3d46f9cd913"} Apr 24 21:52:11.093188 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:11.093185 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" event={"ID":"6249ac83-c5dd-4835-bff9-49a2b34befeb","Type":"ContainerStarted","Data":"64f0195b87bfb2c032c99a2a981623bfb21e2a1ce5e5c7b28b64db77c856acb2"} Apr 24 21:52:11.093661 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:11.093376 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:11.110946 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:11.110894 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podStartSLOduration=6.110878297 podStartE2EDuration="6.110878297s" podCreationTimestamp="2026-04-24 21:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:52:11.109331652 +0000 UTC m=+2184.718470642" watchObservedRunningTime="2026-04-24 21:52:11.110878297 +0000 UTC m=+2184.720017276" Apr 24 21:52:12.096141 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:12.096107 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:12.097375 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:12.097341 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:52:13.100489 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:13.100441 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:52:18.106726 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:18.106687 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:52:18.107224 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:18.107196 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:52:28.107899 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:28.107856 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:52:38.107465 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:38.107418 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:52:48.107678 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:48.107637 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:52:58.107780 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:52:58.107739 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:53:08.107271 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:08.107177 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:53:18.107870 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:18.107825 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:53:28.108550 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:28.108515 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:53:35.329969 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:35.329937 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k"] Apr 24 21:53:35.330477 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:35.330264 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" containerID="cri-o://64f0195b87bfb2c032c99a2a981623bfb21e2a1ce5e5c7b28b64db77c856acb2" gracePeriod=30 Apr 24 21:53:35.330477 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:35.330343 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kube-rbac-proxy" containerID="cri-o://bb7de130379f55fead8a3d0f3df6e0e95c9bad6155a7ad1b2f03b3d46f9cd913" gracePeriod=30 Apr 24 21:53:36.327774 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:36.327743 2581 generic.go:358] "Generic (PLEG): container finished" 
podID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerID="bb7de130379f55fead8a3d0f3df6e0e95c9bad6155a7ad1b2f03b3d46f9cd913" exitCode=2 Apr 24 21:53:36.327949 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:36.327790 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" event={"ID":"6249ac83-c5dd-4835-bff9-49a2b34befeb","Type":"ContainerDied","Data":"bb7de130379f55fead8a3d0f3df6e0e95c9bad6155a7ad1b2f03b3d46f9cd913"} Apr 24 21:53:38.101157 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:38.101107 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 24 21:53:38.108067 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:38.108037 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:53:40.341919 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.341886 2581 generic.go:358] "Generic (PLEG): container finished" podID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerID="64f0195b87bfb2c032c99a2a981623bfb21e2a1ce5e5c7b28b64db77c856acb2" exitCode=0 Apr 24 21:53:40.342275 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.341934 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" event={"ID":"6249ac83-c5dd-4835-bff9-49a2b34befeb","Type":"ContainerDied","Data":"64f0195b87bfb2c032c99a2a981623bfb21e2a1ce5e5c7b28b64db77c856acb2"} Apr 24 21:53:40.459112 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.459089 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:53:40.528297 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.528268 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6249ac83-c5dd-4835-bff9-49a2b34befeb-kserve-provision-location\") pod \"6249ac83-c5dd-4835-bff9-49a2b34befeb\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " Apr 24 21:53:40.528482 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.528311 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6249ac83-c5dd-4835-bff9-49a2b34befeb-proxy-tls\") pod \"6249ac83-c5dd-4835-bff9-49a2b34befeb\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " Apr 24 21:53:40.528482 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.528345 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8s5j\" (UniqueName: \"kubernetes.io/projected/6249ac83-c5dd-4835-bff9-49a2b34befeb-kube-api-access-h8s5j\") pod \"6249ac83-c5dd-4835-bff9-49a2b34befeb\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " Apr 24 21:53:40.528721 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.528684 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6249ac83-c5dd-4835-bff9-49a2b34befeb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6249ac83-c5dd-4835-bff9-49a2b34befeb" (UID: "6249ac83-c5dd-4835-bff9-49a2b34befeb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:53:40.530519 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.530488 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6249ac83-c5dd-4835-bff9-49a2b34befeb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6249ac83-c5dd-4835-bff9-49a2b34befeb" (UID: "6249ac83-c5dd-4835-bff9-49a2b34befeb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:53:40.530519 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.530496 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6249ac83-c5dd-4835-bff9-49a2b34befeb-kube-api-access-h8s5j" (OuterVolumeSpecName: "kube-api-access-h8s5j") pod "6249ac83-c5dd-4835-bff9-49a2b34befeb" (UID: "6249ac83-c5dd-4835-bff9-49a2b34befeb"). InnerVolumeSpecName "kube-api-access-h8s5j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:53:40.628932 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.628832 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6249ac83-c5dd-4835-bff9-49a2b34befeb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"6249ac83-c5dd-4835-bff9-49a2b34befeb\" (UID: \"6249ac83-c5dd-4835-bff9-49a2b34befeb\") " Apr 24 21:53:40.629084 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.628976 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6249ac83-c5dd-4835-bff9-49a2b34befeb-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:53:40.629084 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.628988 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6249ac83-c5dd-4835-bff9-49a2b34befeb-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:53:40.629084 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.628998 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8s5j\" (UniqueName: \"kubernetes.io/projected/6249ac83-c5dd-4835-bff9-49a2b34befeb-kube-api-access-h8s5j\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:53:40.629212 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.629187 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6249ac83-c5dd-4835-bff9-49a2b34befeb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "6249ac83-c5dd-4835-bff9-49a2b34befeb" (UID: "6249ac83-c5dd-4835-bff9-49a2b34befeb"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:53:40.729286 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:40.729246 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6249ac83-c5dd-4835-bff9-49a2b34befeb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:53:41.346224 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:41.346139 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" event={"ID":"6249ac83-c5dd-4835-bff9-49a2b34befeb","Type":"ContainerDied","Data":"05e8e3fba59172636bf713169a495d8f0bfd7d514717fb4db11e5afb609be898"} Apr 24 21:53:41.346224 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:41.346186 2581 scope.go:117] "RemoveContainer" containerID="bb7de130379f55fead8a3d0f3df6e0e95c9bad6155a7ad1b2f03b3d46f9cd913" Apr 24 21:53:41.346751 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:41.346155 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k" Apr 24 21:53:41.355804 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:41.355781 2581 scope.go:117] "RemoveContainer" containerID="64f0195b87bfb2c032c99a2a981623bfb21e2a1ce5e5c7b28b64db77c856acb2" Apr 24 21:53:41.363174 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:41.363075 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k"] Apr 24 21:53:41.363247 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:41.363226 2581 scope.go:117] "RemoveContainer" containerID="5f03666c1fa6183c6e802725bb37421896d7901513a306773da94ecefaa7d322" Apr 24 21:53:41.366316 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:41.366296 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-rfd5k"] Apr 24 21:53:42.949911 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:53:42.949878 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" path="/var/lib/kubelet/pods/6249ac83-c5dd-4835-bff9-49a2b34befeb/volumes" Apr 24 21:54:55.554877 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.554843 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp"] Apr 24 21:54:55.555349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.555119 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kube-rbac-proxy" Apr 24 21:54:55.555349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.555132 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kube-rbac-proxy" Apr 24 21:54:55.555349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.555152 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="storage-initializer" Apr 24 21:54:55.555349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.555160 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="storage-initializer" Apr 24 21:54:55.555349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.555177 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" Apr 24 21:54:55.555349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.555182 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" Apr 24 21:54:55.555349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.555255 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kserve-container" Apr 24 21:54:55.555349 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.555263 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6249ac83-c5dd-4835-bff9-49a2b34befeb" containerName="kube-rbac-proxy" Apr 24 21:54:55.558167 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.558152 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.560567 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.560546 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 24 21:54:55.560844 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.560828 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 24 21:54:55.560896 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.560836 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 21:54:55.560971 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.560876 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:54:55.561105 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.561068 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:54:55.568139 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.568117 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp"] Apr 24 21:54:55.599720 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.599684 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.599859 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.599726 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mhkb\" (UniqueName: \"kubernetes.io/projected/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kube-api-access-8mhkb\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.599859 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.599804 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.599941 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.599864 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.700479 ip-10-0-132-159 kubenswrapper[2581]: I0424 
21:54:55.700445 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.700643 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.700486 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.700643 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.700525 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.700643 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.700542 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mhkb\" (UniqueName: \"kubernetes.io/projected/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kube-api-access-8mhkb\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.700968 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.700946 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.701234 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.701214 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.702909 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.702893 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.710348 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.710318 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mhkb\" 
(UniqueName: \"kubernetes.io/projected/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kube-api-access-8mhkb\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.868530 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.868441 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:54:55.989204 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.989179 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp"] Apr 24 21:54:55.990900 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:54:55.990866 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bfd378e_ba72_46bf_92bc_8eb01d0acdc7.slice/crio-7e65dff12fa06223feb117dd867bf937fd01d6988186226c13a3c83f84847c20 WatchSource:0}: Error finding container 7e65dff12fa06223feb117dd867bf937fd01d6988186226c13a3c83f84847c20: Status 404 returned error can't find the container with id 7e65dff12fa06223feb117dd867bf937fd01d6988186226c13a3c83f84847c20 Apr 24 21:54:55.992697 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:55.992679 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:54:56.551293 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:56.551255 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" event={"ID":"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7","Type":"ContainerStarted","Data":"36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0"} Apr 24 21:54:56.551293 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:54:56.551296 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" event={"ID":"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7","Type":"ContainerStarted","Data":"7e65dff12fa06223feb117dd867bf937fd01d6988186226c13a3c83f84847c20"} Apr 24 21:55:00.563857 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:00.563822 2581 generic.go:358] "Generic (PLEG): container finished" podID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerID="36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0" exitCode=0 Apr 24 21:55:00.564244 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:00.563869 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" event={"ID":"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7","Type":"ContainerDied","Data":"36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0"} Apr 24 21:55:01.568757 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:01.568717 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" event={"ID":"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7","Type":"ContainerStarted","Data":"39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35"} Apr 24 21:55:01.568757 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:01.568760 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" 
event={"ID":"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7","Type":"ContainerStarted","Data":"a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa"} Apr 24 21:55:01.569274 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:01.568984 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:55:01.587246 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:01.587201 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" podStartSLOduration=6.587189051 podStartE2EDuration="6.587189051s" podCreationTimestamp="2026-04-24 21:54:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:55:01.585315317 +0000 UTC m=+2355.194454296" watchObservedRunningTime="2026-04-24 21:55:01.587189051 +0000 UTC m=+2355.196328031" Apr 24 21:55:02.572081 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:02.572051 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:55:08.580340 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:08.580313 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:55:38.581893 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:38.581805 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:55:47.001922 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:47.001889 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:55:47.005614 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:47.005593 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 21:55:48.581843 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:48.581804 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:55:58.581091 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:55:58.581050 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:56:08.581283 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:08.581205 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:56:18.584028 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:18.584001 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:56:25.673622 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.673587 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp"] Apr 24 21:56:25.674101 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.673962 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kserve-container" containerID="cri-o://a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa" gracePeriod=30 Apr 24 21:56:25.674101 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.674031 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kube-rbac-proxy" containerID="cri-o://39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35" gracePeriod=30 Apr 24 21:56:25.785892 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.785853 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66"] Apr 24 21:56:25.789206 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.789188 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:25.791236 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.791219 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 24 21:56:25.791498 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.791477 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 24 21:56:25.798878 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.798855 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66"] Apr 24 21:56:25.814175 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.814143 2581 generic.go:358] "Generic (PLEG): container finished" podID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerID="39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35" exitCode=2 Apr 24 21:56:25.814331 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.814203 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" event={"ID":"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7","Type":"ContainerDied","Data":"39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35"} Apr 24 21:56:25.902972 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.902916 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb56w\" (UniqueName: \"kubernetes.io/projected/4553b61a-1162-46cd-a4d8-528f599d6347-kube-api-access-rb56w\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:25.903152 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.903042 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4553b61a-1162-46cd-a4d8-528f599d6347-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:25.903152 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.903102 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4553b61a-1162-46cd-a4d8-528f599d6347-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:25.903152 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:25.903123 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4553b61a-1162-46cd-a4d8-528f599d6347-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.003864 ip-10-0-132-159 
kubenswrapper[2581]: I0424 21:56:26.003758 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rb56w\" (UniqueName: \"kubernetes.io/projected/4553b61a-1162-46cd-a4d8-528f599d6347-kube-api-access-rb56w\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.004030 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.003899 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4553b61a-1162-46cd-a4d8-528f599d6347-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.004030 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.003973 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4553b61a-1162-46cd-a4d8-528f599d6347-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.004030 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:56:26.003999 2581 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-serving-cert: secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 24 21:56:26.004166 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:56:26.004073 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4553b61a-1162-46cd-a4d8-528f599d6347-proxy-tls podName:4553b61a-1162-46cd-a4d8-528f599d6347 nodeName:}" failed. No retries permitted until 2026-04-24 21:56:26.50405274 +0000 UTC m=+2440.113191730 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4553b61a-1162-46cd-a4d8-528f599d6347-proxy-tls") pod "isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" (UID: "4553b61a-1162-46cd-a4d8-528f599d6347") : secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 24 21:56:26.004166 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.004001 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4553b61a-1162-46cd-a4d8-528f599d6347-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.004522 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.004498 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4553b61a-1162-46cd-a4d8-528f599d6347-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.004722 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.004704 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4553b61a-1162-46cd-a4d8-528f599d6347-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.014664 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.014638 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb56w\" (UniqueName: \"kubernetes.io/projected/4553b61a-1162-46cd-a4d8-528f599d6347-kube-api-access-rb56w\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.506428 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.506382 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4553b61a-1162-46cd-a4d8-528f599d6347-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.519880 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.519851 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4553b61a-1162-46cd-a4d8-528f599d6347-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.699711 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.699665 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:26.816657 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:26.816630 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66"] Apr 24 21:56:26.822748 ip-10-0-132-159 kubenswrapper[2581]: W0424 21:56:26.822505 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4553b61a_1162_46cd_a4d8_528f599d6347.slice/crio-167dd4011a5b5d6589b2d0c7898f15642a5d68a131cce9888d4bc65918561e06 WatchSource:0}: Error finding container 167dd4011a5b5d6589b2d0c7898f15642a5d68a131cce9888d4bc65918561e06: Status 404 returned error can't find the container with id 167dd4011a5b5d6589b2d0c7898f15642a5d68a131cce9888d4bc65918561e06 Apr 24 21:56:27.822280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:27.822243 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" event={"ID":"4553b61a-1162-46cd-a4d8-528f599d6347","Type":"ContainerStarted","Data":"7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12"} Apr 24 21:56:27.822280 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:27.822280 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" event={"ID":"4553b61a-1162-46cd-a4d8-528f599d6347","Type":"ContainerStarted","Data":"167dd4011a5b5d6589b2d0c7898f15642a5d68a131cce9888d4bc65918561e06"} Apr 24 21:56:28.575748 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:28.575708 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 24 21:56:28.581096 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:28.581068 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:56:30.413785 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.413760 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:56:30.541888 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.541819 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-proxy-tls\") pod \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " Apr 24 21:56:30.541888 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.541868 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " Apr 24 21:56:30.542082 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.541908 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mhkb\" (UniqueName: \"kubernetes.io/projected/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kube-api-access-8mhkb\") pod \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " Apr 24 21:56:30.542082 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.541943 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kserve-provision-location\") pod \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\" (UID: \"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7\") " Apr 24 21:56:30.542332 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.542300 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" (UID: "3bfd378e-ba72-46bf-92bc-8eb01d0acdc7"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:56:30.542428 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.542321 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" (UID: "3bfd378e-ba72-46bf-92bc-8eb01d0acdc7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:30.543921 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.543894 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kube-api-access-8mhkb" (OuterVolumeSpecName: "kube-api-access-8mhkb") pod "3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" (UID: "3bfd378e-ba72-46bf-92bc-8eb01d0acdc7"). InnerVolumeSpecName "kube-api-access-8mhkb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:56:30.543921 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.543913 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" (UID: "3bfd378e-ba72-46bf-92bc-8eb01d0acdc7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:56:30.642998 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.642957 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:56:30.642998 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.642990 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8mhkb\" (UniqueName: \"kubernetes.io/projected/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kube-api-access-8mhkb\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:56:30.642998 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.643001 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:56:30.642998 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.643011 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:56:30.833099 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.833010 2581 generic.go:358] "Generic (PLEG): container finished" podID="4553b61a-1162-46cd-a4d8-528f599d6347" containerID="7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12" exitCode=0 Apr 24 21:56:30.833254 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.833089 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" event={"ID":"4553b61a-1162-46cd-a4d8-528f599d6347","Type":"ContainerDied","Data":"7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12"} Apr 24 21:56:30.834917 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.834894 2581 generic.go:358] "Generic (PLEG): container finished" podID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerID="a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa" exitCode=0 Apr 24 21:56:30.835031 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.834923 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" event={"ID":"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7","Type":"ContainerDied","Data":"a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa"} Apr 24 21:56:30.835031 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.834944 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" event={"ID":"3bfd378e-ba72-46bf-92bc-8eb01d0acdc7","Type":"ContainerDied","Data":"7e65dff12fa06223feb117dd867bf937fd01d6988186226c13a3c83f84847c20"} Apr 24 21:56:30.835031 ip-10-0-132-159 kubenswrapper[2581]: 
I0424 21:56:30.834959 2581 scope.go:117] "RemoveContainer" containerID="39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35" Apr 24 21:56:30.835031 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.834971 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp" Apr 24 21:56:30.843432 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.843332 2581 scope.go:117] "RemoveContainer" containerID="a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa" Apr 24 21:56:30.850460 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.850420 2581 scope.go:117] "RemoveContainer" containerID="36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0" Apr 24 21:56:30.857719 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.857697 2581 scope.go:117] "RemoveContainer" containerID="39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35" Apr 24 21:56:30.857967 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:56:30.857944 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35\": container with ID starting with 39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35 not found: ID does not exist" containerID="39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35" Apr 24 21:56:30.858040 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.857974 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35"} err="failed to get container status \"39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35\": rpc error: code = NotFound desc = could not find container \"39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35\": container with ID starting with 39431402ddf5bf6ba0622f636e0adab935b0e17326a1163b8c98a12917575f35 not found: ID does not exist" Apr 24 21:56:30.858040 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.857993 2581 scope.go:117] "RemoveContainer" containerID="a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa" Apr 24 21:56:30.858223 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:56:30.858204 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa\": container with ID starting with a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa not found: ID does not exist" containerID="a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa" Apr 24 21:56:30.858282 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.858227 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa"} err="failed to get container status \"a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa\": rpc error: code = NotFound desc = could not find container \"a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa\": container with ID starting with a516ddae93d4566c70be914487d22702522387ba56c47c6b0a8ce4cc6f9e5efa not found: ID does not exist" Apr 24 21:56:30.858282 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.858243 2581 scope.go:117] "RemoveContainer" 
containerID="36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0" Apr 24 21:56:30.858560 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:56:30.858537 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0\": container with ID starting with 36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0 not found: ID does not exist" containerID="36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0" Apr 24 21:56:30.858634 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.858567 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0"} err="failed to get container status \"36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0\": rpc error: code = NotFound desc = could not find container \"36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0\": container with ID starting with 36cf9eb058045510b6d5a346f1adf1101aa5a55c77796e4eb10bd1b5e98815b0 not found: ID does not exist" Apr 24 21:56:30.864292 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.864271 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp"] Apr 24 21:56:30.876902 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.876881 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-hwdzp"] Apr 24 21:56:30.950201 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:30.950177 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" path="/var/lib/kubelet/pods/3bfd378e-ba72-46bf-92bc-8eb01d0acdc7/volumes" Apr 24 21:56:31.840086 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:31.840054 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" event={"ID":"4553b61a-1162-46cd-a4d8-528f599d6347","Type":"ContainerStarted","Data":"8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1"} Apr 24 21:56:31.840544 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:31.840096 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" event={"ID":"4553b61a-1162-46cd-a4d8-528f599d6347","Type":"ContainerStarted","Data":"6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8"} Apr 24 21:56:31.840544 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:31.840305 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:31.858589 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:31.858542 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podStartSLOduration=6.858527537 podStartE2EDuration="6.858527537s" podCreationTimestamp="2026-04-24 21:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:31.857064437 +0000 UTC m=+2445.466203413" watchObservedRunningTime="2026-04-24 21:56:31.858527537 +0000 UTC m=+2445.467666517" Apr 24 21:56:32.844619 ip-10-0-132-159 kubenswrapper[2581]: I0424 
21:56:32.844586 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:56:38.856023 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:56:38.855990 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:57:08.857539 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:08.857500 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:57:18.857261 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:18.857219 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:57:28.857243 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:28.857207 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:57:38.856904 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:38.856818 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:57:40.946740 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:40.946699 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:57:50.950527 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:50.950491 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:57:55.850073 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:55.850036 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66"] Apr 24 21:57:55.850537 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:55.850466 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" 
containerID="cri-o://6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8" gracePeriod=30 Apr 24 21:57:55.850612 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:55.850513 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kube-rbac-proxy" containerID="cri-o://8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1" gracePeriod=30 Apr 24 21:57:56.082439 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:56.082382 2581 generic.go:358] "Generic (PLEG): container finished" podID="4553b61a-1162-46cd-a4d8-528f599d6347" containerID="8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1" exitCode=2 Apr 24 21:57:56.082439 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:56.082438 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" event={"ID":"4553b61a-1162-46cd-a4d8-528f599d6347","Type":"ContainerDied","Data":"8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1"} Apr 24 21:57:58.850748 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:57:58.850699 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 24 21:58:00.946663 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:00.946617 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:58:01.097561 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.097536 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:58:01.097867 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.097833 2581 generic.go:358] "Generic (PLEG): container finished" podID="4553b61a-1162-46cd-a4d8-528f599d6347" containerID="6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8" exitCode=0 Apr 24 21:58:01.097959 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.097897 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" event={"ID":"4553b61a-1162-46cd-a4d8-528f599d6347","Type":"ContainerDied","Data":"6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8"} Apr 24 21:58:01.097959 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.097936 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" event={"ID":"4553b61a-1162-46cd-a4d8-528f599d6347","Type":"ContainerDied","Data":"167dd4011a5b5d6589b2d0c7898f15642a5d68a131cce9888d4bc65918561e06"} Apr 24 21:58:01.097959 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.097956 2581 scope.go:117] "RemoveContainer" containerID="8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1" Apr 24 21:58:01.107558 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.107481 2581 scope.go:117] "RemoveContainer" containerID="6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8" Apr 24 21:58:01.114990 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.114964 2581 scope.go:117] "RemoveContainer" containerID="7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12" Apr 24 21:58:01.122926 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.122906 2581 scope.go:117] "RemoveContainer" containerID="8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1" Apr 24 21:58:01.123210 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:58:01.123191 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1\": container with ID starting with 8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1 not found: ID does not exist" containerID="8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1" Apr 24 21:58:01.123294 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.123218 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1"} err="failed to get container status \"8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1\": rpc error: code = NotFound desc = could not find container \"8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1\": container with ID starting with 8130d08891c67e7ad5fbcfc002fd9ed7933d37af870c90d3a9f527dc8894d0f1 not found: ID does not exist" Apr 24 21:58:01.123294 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.123236 2581 scope.go:117] "RemoveContainer" containerID="6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8" Apr 24 21:58:01.123497 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:58:01.123480 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8\": container with ID starting with 
6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8 not found: ID does not exist" containerID="6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8" Apr 24 21:58:01.123546 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.123501 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8"} err="failed to get container status \"6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8\": rpc error: code = NotFound desc = could not find container \"6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8\": container with ID starting with 6fe3d8d6a7f3cadd3727433871411f3b265ad4d0de62f7dde59458d12d5cbbe8 not found: ID does not exist" Apr 24 21:58:01.123546 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.123516 2581 scope.go:117] "RemoveContainer" containerID="7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12" Apr 24 21:58:01.123721 ip-10-0-132-159 kubenswrapper[2581]: E0424 21:58:01.123707 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12\": container with ID starting with 7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12 not found: ID does not exist" containerID="7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12" Apr 24 21:58:01.123764 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.123726 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12"} err="failed to get container status \"7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12\": rpc error: code = NotFound desc = could not find container \"7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12\": container with ID starting with 7c78dc27b2f4133dd80f40ac77e3f220c1e5be938341416333aa1784bdef7a12 not found: ID does not exist" Apr 24 21:58:01.239208 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.239116 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4553b61a-1162-46cd-a4d8-528f599d6347-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"4553b61a-1162-46cd-a4d8-528f599d6347\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " Apr 24 21:58:01.239208 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.239162 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4553b61a-1162-46cd-a4d8-528f599d6347-kserve-provision-location\") pod \"4553b61a-1162-46cd-a4d8-528f599d6347\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " Apr 24 21:58:01.239208 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.239208 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb56w\" (UniqueName: \"kubernetes.io/projected/4553b61a-1162-46cd-a4d8-528f599d6347-kube-api-access-rb56w\") pod \"4553b61a-1162-46cd-a4d8-528f599d6347\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " Apr 24 21:58:01.239504 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.239258 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/4553b61a-1162-46cd-a4d8-528f599d6347-proxy-tls\") pod \"4553b61a-1162-46cd-a4d8-528f599d6347\" (UID: \"4553b61a-1162-46cd-a4d8-528f599d6347\") " Apr 24 21:58:01.239613 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.239585 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4553b61a-1162-46cd-a4d8-528f599d6347-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4553b61a-1162-46cd-a4d8-528f599d6347" (UID: "4553b61a-1162-46cd-a4d8-528f599d6347"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:58:01.239679 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.239583 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4553b61a-1162-46cd-a4d8-528f599d6347-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "4553b61a-1162-46cd-a4d8-528f599d6347" (UID: "4553b61a-1162-46cd-a4d8-528f599d6347"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:58:01.241329 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.241300 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4553b61a-1162-46cd-a4d8-528f599d6347-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4553b61a-1162-46cd-a4d8-528f599d6347" (UID: "4553b61a-1162-46cd-a4d8-528f599d6347"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:58:01.241329 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.241304 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4553b61a-1162-46cd-a4d8-528f599d6347-kube-api-access-rb56w" (OuterVolumeSpecName: "kube-api-access-rb56w") pod "4553b61a-1162-46cd-a4d8-528f599d6347" (UID: "4553b61a-1162-46cd-a4d8-528f599d6347"). InnerVolumeSpecName "kube-api-access-rb56w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:58:01.340181 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.340130 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rb56w\" (UniqueName: \"kubernetes.io/projected/4553b61a-1162-46cd-a4d8-528f599d6347-kube-api-access-rb56w\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:58:01.340181 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.340177 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4553b61a-1162-46cd-a4d8-528f599d6347-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:58:01.340181 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.340188 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4553b61a-1162-46cd-a4d8-528f599d6347-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:58:01.340181 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:01.340199 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4553b61a-1162-46cd-a4d8-528f599d6347-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 21:58:02.101377 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:02.101347 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66" Apr 24 21:58:02.120792 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:02.120761 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66"] Apr 24 21:58:02.126203 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:02.126181 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-hqw66"] Apr 24 21:58:02.949502 ip-10-0-132-159 kubenswrapper[2581]: I0424 21:58:02.949469 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" path="/var/lib/kubelet/pods/4553b61a-1162-46cd-a4d8-528f599d6347/volumes" Apr 24 22:00:47.020786 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:00:47.020756 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:00:47.024509 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:00:47.024487 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:03:56.105117 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105078 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm"] Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105366 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105382 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" Apr 24 22:03:56.105709 ip-10-0-132-159 
kubenswrapper[2581]: I0424 22:03:56.105417 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="storage-initializer" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105424 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="storage-initializer" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105431 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kserve-container" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105440 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kserve-container" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105453 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kube-rbac-proxy" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105459 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kube-rbac-proxy" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105467 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="storage-initializer" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105474 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="storage-initializer" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105483 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kube-rbac-proxy" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105488 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kube-rbac-proxy" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105541 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kube-rbac-proxy" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105550 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kserve-container" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105556 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bfd378e-ba72-46bf-92bc-8eb01d0acdc7" containerName="kube-rbac-proxy" Apr 24 22:03:56.105709 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.105563 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="4553b61a-1162-46cd-a4d8-528f599d6347" containerName="kserve-container" Apr 24 22:03:56.108580 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.108562 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.110510 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.110482 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 24 22:03:56.110652 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.110528 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 22:03:56.110652 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.110546 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 24 22:03:56.110652 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.110647 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 22:03:56.110845 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.110825 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 22:03:56.116427 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.116406 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm"] Apr 24 22:03:56.206109 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.206073 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dd91356-b9fe-4d4b-9476-9634f7243f78-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.206272 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.206119 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dd91356-b9fe-4d4b-9476-9634f7243f78-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.206272 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.206188 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dd91356-b9fe-4d4b-9476-9634f7243f78-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.206272 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.206206 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvqlh\" (UniqueName: \"kubernetes.io/projected/5dd91356-b9fe-4d4b-9476-9634f7243f78-kube-api-access-tvqlh\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.307247 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.307213 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/5dd91356-b9fe-4d4b-9476-9634f7243f78-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.307422 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.307254 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dd91356-b9fe-4d4b-9476-9634f7243f78-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.307422 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.307304 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dd91356-b9fe-4d4b-9476-9634f7243f78-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.307422 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.307324 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvqlh\" (UniqueName: \"kubernetes.io/projected/5dd91356-b9fe-4d4b-9476-9634f7243f78-kube-api-access-tvqlh\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.307662 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.307640 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dd91356-b9fe-4d4b-9476-9634f7243f78-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.307913 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.307892 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dd91356-b9fe-4d4b-9476-9634f7243f78-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.309805 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.309784 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dd91356-b9fe-4d4b-9476-9634f7243f78-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.315345 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.315326 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvqlh\" (UniqueName: \"kubernetes.io/projected/5dd91356-b9fe-4d4b-9476-9634f7243f78-kube-api-access-tvqlh\") pod \"isvc-tensorflow-predictor-6756f669d7-69whm\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.418638 ip-10-0-132-159 kubenswrapper[2581]: I0424 
22:03:56.418606 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:03:56.540095 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.540064 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm"] Apr 24 22:03:56.543912 ip-10-0-132-159 kubenswrapper[2581]: W0424 22:03:56.543886 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd91356_b9fe_4d4b_9476_9634f7243f78.slice/crio-27a001e3ae4bba9e679328a85d57c4b206b989524ba4d992cbb7a0d22a7f4a26 WatchSource:0}: Error finding container 27a001e3ae4bba9e679328a85d57c4b206b989524ba4d992cbb7a0d22a7f4a26: Status 404 returned error can't find the container with id 27a001e3ae4bba9e679328a85d57c4b206b989524ba4d992cbb7a0d22a7f4a26 Apr 24 22:03:56.545917 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:56.545900 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:03:57.059572 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:57.059535 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" event={"ID":"5dd91356-b9fe-4d4b-9476-9634f7243f78","Type":"ContainerStarted","Data":"43d883cb93b2398e045cd04bcc73ae280fdf24c14eb26a2b8c9cce283b3e68f6"} Apr 24 22:03:57.059729 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:03:57.059579 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" event={"ID":"5dd91356-b9fe-4d4b-9476-9634f7243f78","Type":"ContainerStarted","Data":"27a001e3ae4bba9e679328a85d57c4b206b989524ba4d992cbb7a0d22a7f4a26"} Apr 24 22:04:02.076745 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:02.076703 2581 generic.go:358] "Generic (PLEG): container finished" podID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerID="43d883cb93b2398e045cd04bcc73ae280fdf24c14eb26a2b8c9cce283b3e68f6" exitCode=0 Apr 24 22:04:02.077136 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:02.076776 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" event={"ID":"5dd91356-b9fe-4d4b-9476-9634f7243f78","Type":"ContainerDied","Data":"43d883cb93b2398e045cd04bcc73ae280fdf24c14eb26a2b8c9cce283b3e68f6"} Apr 24 22:04:06.092464 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:06.092429 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" event={"ID":"5dd91356-b9fe-4d4b-9476-9634f7243f78","Type":"ContainerStarted","Data":"23252b964dc385221bfe175cc27bddaa6c5b35ac92cd42eb596a6f318d7c82e7"} Apr 24 22:04:06.092842 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:06.092473 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" event={"ID":"5dd91356-b9fe-4d4b-9476-9634f7243f78","Type":"ContainerStarted","Data":"b615b645c2745dd41f520c5e375f2a7f7c3c0e9ccb43b0b55d9eeaad13cb8ee2"} Apr 24 22:04:06.092842 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:06.092753 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:04:06.092912 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:06.092882 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:04:06.093936 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:06.093914 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 22:04:06.110696 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:06.110656 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podStartSLOduration=6.278274452 podStartE2EDuration="10.110644202s" podCreationTimestamp="2026-04-24 22:03:56 +0000 UTC" firstStartedPulling="2026-04-24 22:04:02.077967178 +0000 UTC m=+2895.687106134" lastFinishedPulling="2026-04-24 22:04:05.910336927 +0000 UTC m=+2899.519475884" observedRunningTime="2026-04-24 22:04:06.108669227 +0000 UTC m=+2899.717808205" watchObservedRunningTime="2026-04-24 22:04:06.110644202 +0000 UTC m=+2899.719783182" Apr 24 22:04:07.095151 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:07.095088 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 22:04:12.105704 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:12.105671 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:04:12.106341 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:12.106307 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 22:04:22.107499 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:22.107464 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:04:37.068224 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.068189 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm"] Apr 24 22:04:37.068853 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.068569 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kserve-container" containerID="cri-o://b615b645c2745dd41f520c5e375f2a7f7c3c0e9ccb43b0b55d9eeaad13cb8ee2" gracePeriod=30 Apr 24 22:04:37.068853 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.068593 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" containerID="cri-o://23252b964dc385221bfe175cc27bddaa6c5b35ac92cd42eb596a6f318d7c82e7" gracePeriod=30 Apr 24 22:04:37.096072 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.096038 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" 
podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 22:04:37.156940 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.156906 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d"] Apr 24 22:04:37.160149 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.160132 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.162280 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.162255 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 24 22:04:37.162404 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.162351 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:04:37.169655 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.169633 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d"] Apr 24 22:04:37.215155 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.215126 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e836c2d-46d1-477b-9fe7-0a9606a69df8-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.215155 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.215156 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.215329 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.215182 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzt2\" (UniqueName: \"kubernetes.io/projected/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kube-api-access-lvzt2\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.215329 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.215209 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8e836c2d-46d1-477b-9fe7-0a9606a69df8-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.316054 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.316023 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/8e836c2d-46d1-477b-9fe7-0a9606a69df8-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.316192 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.316065 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.316246 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.316226 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzt2\" (UniqueName: \"kubernetes.io/projected/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kube-api-access-lvzt2\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.316308 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.316292 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8e836c2d-46d1-477b-9fe7-0a9606a69df8-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.316430 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.316412 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.316871 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.316850 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8e836c2d-46d1-477b-9fe7-0a9606a69df8-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.318596 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.318545 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e836c2d-46d1-477b-9fe7-0a9606a69df8-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.323292 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.323263 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzt2\" (UniqueName: \"kubernetes.io/projected/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kube-api-access-lvzt2\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d\" (UID: 
\"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.470017 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.469983 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:37.590857 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:37.590830 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d"] Apr 24 22:04:37.593136 ip-10-0-132-159 kubenswrapper[2581]: W0424 22:04:37.593107 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e836c2d_46d1_477b_9fe7_0a9606a69df8.slice/crio-994ef6514bf3477acaf50ad6e52bc8c677b3ff2189d6a118ce817bd654bcd744 WatchSource:0}: Error finding container 994ef6514bf3477acaf50ad6e52bc8c677b3ff2189d6a118ce817bd654bcd744: Status 404 returned error can't find the container with id 994ef6514bf3477acaf50ad6e52bc8c677b3ff2189d6a118ce817bd654bcd744 Apr 24 22:04:38.180806 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:38.180766 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" event={"ID":"8e836c2d-46d1-477b-9fe7-0a9606a69df8","Type":"ContainerStarted","Data":"92753dbc20c04ac5c691b33b4e4e98258f17b6cd66d634320df46f679c4de7f2"} Apr 24 22:04:38.180806 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:38.180810 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" event={"ID":"8e836c2d-46d1-477b-9fe7-0a9606a69df8","Type":"ContainerStarted","Data":"994ef6514bf3477acaf50ad6e52bc8c677b3ff2189d6a118ce817bd654bcd744"} Apr 24 22:04:38.182542 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:38.182517 2581 generic.go:358] "Generic (PLEG): container finished" podID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerID="23252b964dc385221bfe175cc27bddaa6c5b35ac92cd42eb596a6f318d7c82e7" exitCode=2 Apr 24 22:04:38.182650 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:38.182560 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" event={"ID":"5dd91356-b9fe-4d4b-9476-9634f7243f78","Type":"ContainerDied","Data":"23252b964dc385221bfe175cc27bddaa6c5b35ac92cd42eb596a6f318d7c82e7"} Apr 24 22:04:42.095926 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:42.095886 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 22:04:42.197021 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:42.196986 2581 generic.go:358] "Generic (PLEG): container finished" podID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerID="92753dbc20c04ac5c691b33b4e4e98258f17b6cd66d634320df46f679c4de7f2" exitCode=0 Apr 24 22:04:42.197191 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:42.197059 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" event={"ID":"8e836c2d-46d1-477b-9fe7-0a9606a69df8","Type":"ContainerDied","Data":"92753dbc20c04ac5c691b33b4e4e98258f17b6cd66d634320df46f679c4de7f2"} Apr 24 22:04:43.202360 
ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:43.202319 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" event={"ID":"8e836c2d-46d1-477b-9fe7-0a9606a69df8","Type":"ContainerStarted","Data":"b02c9353c56d647411b1a9b95d1bf7e1b6a96868c82f75e37ad667a286260f4a"} Apr 24 22:04:43.202360 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:43.202365 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" event={"ID":"8e836c2d-46d1-477b-9fe7-0a9606a69df8","Type":"ContainerStarted","Data":"0a249ead5f044786fdb9e8c7ef621fd6508f7600da9a8cdb8b4dc9abb59dd334"} Apr 24 22:04:43.202857 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:43.202749 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:43.202909 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:43.202888 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:43.204284 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:43.204258 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 22:04:43.225102 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:43.225058 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podStartSLOduration=6.225046044 podStartE2EDuration="6.225046044s" podCreationTimestamp="2026-04-24 22:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:04:43.224344979 +0000 UTC m=+2936.833483960" watchObservedRunningTime="2026-04-24 22:04:43.225046044 +0000 UTC m=+2936.834185078" Apr 24 22:04:44.206217 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:44.206175 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 22:04:47.095465 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:47.095426 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 22:04:47.095854 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:47.095543 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:04:49.210594 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:49.210566 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:04:49.211141 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:49.211114 2581 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 22:04:52.095616 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:52.095578 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 22:04:57.096270 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:57.096227 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 22:04:59.211384 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:04:59.211353 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:05:02.095467 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:02.095420 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 22:05:07.086862 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:05:07.086827 2581 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd91356_b9fe_4d4b_9476_9634f7243f78.slice/crio-27a001e3ae4bba9e679328a85d57c4b206b989524ba4d992cbb7a0d22a7f4a26\": RecentStats: unable to find data in memory cache]" Apr 24 22:05:07.096162 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.096129 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 22:05:07.277236 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.277201 2581 generic.go:358] "Generic (PLEG): container finished" podID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerID="b615b645c2745dd41f520c5e375f2a7f7c3c0e9ccb43b0b55d9eeaad13cb8ee2" exitCode=137 Apr 24 22:05:07.277420 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.277273 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" event={"ID":"5dd91356-b9fe-4d4b-9476-9634f7243f78","Type":"ContainerDied","Data":"b615b645c2745dd41f520c5e375f2a7f7c3c0e9ccb43b0b55d9eeaad13cb8ee2"} Apr 24 22:05:07.711372 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.711347 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:05:07.856456 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.856339 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dd91356-b9fe-4d4b-9476-9634f7243f78-kserve-provision-location\") pod \"5dd91356-b9fe-4d4b-9476-9634f7243f78\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " Apr 24 22:05:07.856456 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.856387 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dd91356-b9fe-4d4b-9476-9634f7243f78-proxy-tls\") pod \"5dd91356-b9fe-4d4b-9476-9634f7243f78\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " Apr 24 22:05:07.856456 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.856436 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dd91356-b9fe-4d4b-9476-9634f7243f78-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"5dd91356-b9fe-4d4b-9476-9634f7243f78\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " Apr 24 22:05:07.856684 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.856496 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvqlh\" (UniqueName: \"kubernetes.io/projected/5dd91356-b9fe-4d4b-9476-9634f7243f78-kube-api-access-tvqlh\") pod \"5dd91356-b9fe-4d4b-9476-9634f7243f78\" (UID: \"5dd91356-b9fe-4d4b-9476-9634f7243f78\") " Apr 24 22:05:07.856837 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.856793 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd91356-b9fe-4d4b-9476-9634f7243f78-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "5dd91356-b9fe-4d4b-9476-9634f7243f78" (UID: "5dd91356-b9fe-4d4b-9476-9634f7243f78"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:05:07.858586 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.858560 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd91356-b9fe-4d4b-9476-9634f7243f78-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5dd91356-b9fe-4d4b-9476-9634f7243f78" (UID: "5dd91356-b9fe-4d4b-9476-9634f7243f78"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:05:07.858680 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.858636 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd91356-b9fe-4d4b-9476-9634f7243f78-kube-api-access-tvqlh" (OuterVolumeSpecName: "kube-api-access-tvqlh") pod "5dd91356-b9fe-4d4b-9476-9634f7243f78" (UID: "5dd91356-b9fe-4d4b-9476-9634f7243f78"). InnerVolumeSpecName "kube-api-access-tvqlh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:05:07.870329 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.870301 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd91356-b9fe-4d4b-9476-9634f7243f78-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5dd91356-b9fe-4d4b-9476-9634f7243f78" (UID: "5dd91356-b9fe-4d4b-9476-9634f7243f78"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:07.957730 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.957692 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dd91356-b9fe-4d4b-9476-9634f7243f78-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:05:07.957730 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.957723 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dd91356-b9fe-4d4b-9476-9634f7243f78-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:05:07.957730 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.957733 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dd91356-b9fe-4d4b-9476-9634f7243f78-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:05:07.958034 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:07.957744 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tvqlh\" (UniqueName: \"kubernetes.io/projected/5dd91356-b9fe-4d4b-9476-9634f7243f78-kube-api-access-tvqlh\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:05:08.282810 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:08.282784 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" Apr 24 22:05:08.283306 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:08.282783 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm" event={"ID":"5dd91356-b9fe-4d4b-9476-9634f7243f78","Type":"ContainerDied","Data":"27a001e3ae4bba9e679328a85d57c4b206b989524ba4d992cbb7a0d22a7f4a26"} Apr 24 22:05:08.283306 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:08.282919 2581 scope.go:117] "RemoveContainer" containerID="23252b964dc385221bfe175cc27bddaa6c5b35ac92cd42eb596a6f318d7c82e7" Apr 24 22:05:08.290695 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:08.290519 2581 scope.go:117] "RemoveContainer" containerID="b615b645c2745dd41f520c5e375f2a7f7c3c0e9ccb43b0b55d9eeaad13cb8ee2" Apr 24 22:05:08.297604 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:08.297588 2581 scope.go:117] "RemoveContainer" containerID="43d883cb93b2398e045cd04bcc73ae280fdf24c14eb26a2b8c9cce283b3e68f6" Apr 24 22:05:08.303458 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:08.303433 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm"] Apr 24 22:05:08.307194 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:08.307170 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-69whm"] Apr 24 22:05:08.949770 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:08.949739 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" path="/var/lib/kubelet/pods/5dd91356-b9fe-4d4b-9476-9634f7243f78/volumes" Apr 24 22:05:18.065876 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.065840 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d"] Apr 24 22:05:18.066290 ip-10-0-132-159 
kubenswrapper[2581]: I0424 22:05:18.066130 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kserve-container" containerID="cri-o://0a249ead5f044786fdb9e8c7ef621fd6508f7600da9a8cdb8b4dc9abb59dd334" gracePeriod=30 Apr 24 22:05:18.066290 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.066163 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" containerID="cri-o://b02c9353c56d647411b1a9b95d1bf7e1b6a96868c82f75e37ad667a286260f4a" gracePeriod=30 Apr 24 22:05:18.153910 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.153867 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v"] Apr 24 22:05:18.154222 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.154192 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kserve-container" Apr 24 22:05:18.154222 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.154213 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kserve-container" Apr 24 22:05:18.154350 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.154226 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="storage-initializer" Apr 24 22:05:18.154350 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.154232 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="storage-initializer" Apr 24 22:05:18.154350 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.154255 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" Apr 24 22:05:18.154350 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.154261 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" Apr 24 22:05:18.154350 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.154315 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kube-rbac-proxy" Apr 24 22:05:18.154350 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.154324 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dd91356-b9fe-4d4b-9476-9634f7243f78" containerName="kserve-container" Apr 24 22:05:18.158779 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.158761 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.160847 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.160826 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 24 22:05:18.160921 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.160850 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 24 22:05:18.169696 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.169675 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v"] Apr 24 22:05:18.238064 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.238023 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5l27\" (UniqueName: \"kubernetes.io/projected/8d3295c9-fd45-4290-916a-929197c3d174-kube-api-access-x5l27\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.238218 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.238071 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3295c9-fd45-4290-916a-929197c3d174-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.238218 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.238100 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d3295c9-fd45-4290-916a-929197c3d174-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.238218 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.238181 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3295c9-fd45-4290-916a-929197c3d174-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.316765 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.316695 2581 generic.go:358] "Generic (PLEG): container finished" podID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerID="b02c9353c56d647411b1a9b95d1bf7e1b6a96868c82f75e37ad667a286260f4a" exitCode=2 Apr 24 22:05:18.316919 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.316754 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" event={"ID":"8e836c2d-46d1-477b-9fe7-0a9606a69df8","Type":"ContainerDied","Data":"b02c9353c56d647411b1a9b95d1bf7e1b6a96868c82f75e37ad667a286260f4a"} Apr 24 22:05:18.339188 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.339159 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/8d3295c9-fd45-4290-916a-929197c3d174-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.339300 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.339206 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3295c9-fd45-4290-916a-929197c3d174-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.339300 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.339237 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5l27\" (UniqueName: \"kubernetes.io/projected/8d3295c9-fd45-4290-916a-929197c3d174-kube-api-access-x5l27\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.339377 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.339338 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3295c9-fd45-4290-916a-929197c3d174-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.339771 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.339742 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3295c9-fd45-4290-916a-929197c3d174-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.339931 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.339911 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d3295c9-fd45-4290-916a-929197c3d174-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.341594 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.341574 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3295c9-fd45-4290-916a-929197c3d174-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.349672 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.349644 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5l27\" (UniqueName: \"kubernetes.io/projected/8d3295c9-fd45-4290-916a-929197c3d174-kube-api-access-x5l27\") pod \"isvc-triton-predictor-84bb65d94b-7nd6v\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.468544 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.468508 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:05:18.585203 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:18.585114 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v"] Apr 24 22:05:18.587720 ip-10-0-132-159 kubenswrapper[2581]: W0424 22:05:18.587682 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d3295c9_fd45_4290_916a_929197c3d174.slice/crio-b11392b25ade96a274932d795c7fe65c1415c897924ad24c4116159e1e418cad WatchSource:0}: Error finding container b11392b25ade96a274932d795c7fe65c1415c897924ad24c4116159e1e418cad: Status 404 returned error can't find the container with id b11392b25ade96a274932d795c7fe65c1415c897924ad24c4116159e1e418cad Apr 24 22:05:19.206699 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:19.206658 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 24 22:05:19.321946 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:19.321910 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" event={"ID":"8d3295c9-fd45-4290-916a-929197c3d174","Type":"ContainerStarted","Data":"c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540"} Apr 24 22:05:19.321946 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:19.321948 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" event={"ID":"8d3295c9-fd45-4290-916a-929197c3d174","Type":"ContainerStarted","Data":"b11392b25ade96a274932d795c7fe65c1415c897924ad24c4116159e1e418cad"} Apr 24 22:05:23.336268 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:23.336231 2581 generic.go:358] "Generic (PLEG): container finished" podID="8d3295c9-fd45-4290-916a-929197c3d174" containerID="c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540" exitCode=0 Apr 24 22:05:23.336268 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:23.336270 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" event={"ID":"8d3295c9-fd45-4290-916a-929197c3d174","Type":"ContainerDied","Data":"c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540"} Apr 24 22:05:24.207267 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:24.207226 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 24 22:05:29.207892 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:29.207810 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 24 22:05:29.208359 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:29.207959 2581 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:05:34.206661 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:34.206615 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 24 22:05:39.207179 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:39.206665 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 24 22:05:44.206984 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:44.206934 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 24 22:05:47.047311 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:47.047274 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:05:47.053227 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:47.053192 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:05:48.452549 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.452434 2581 generic.go:358] "Generic (PLEG): container finished" podID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerID="0a249ead5f044786fdb9e8c7ef621fd6508f7600da9a8cdb8b4dc9abb59dd334" exitCode=137 Apr 24 22:05:48.452549 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.452490 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" event={"ID":"8e836c2d-46d1-477b-9fe7-0a9606a69df8","Type":"ContainerDied","Data":"0a249ead5f044786fdb9e8c7ef621fd6508f7600da9a8cdb8b4dc9abb59dd334"} Apr 24 22:05:48.757224 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.757180 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:05:48.819743 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.819704 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e836c2d-46d1-477b-9fe7-0a9606a69df8-proxy-tls\") pod \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " Apr 24 22:05:48.819915 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.819753 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvzt2\" (UniqueName: \"kubernetes.io/projected/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kube-api-access-lvzt2\") pod \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " Apr 24 22:05:48.819915 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.819788 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kserve-provision-location\") pod \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " Apr 24 22:05:48.819915 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.819818 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8e836c2d-46d1-477b-9fe7-0a9606a69df8-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\" (UID: \"8e836c2d-46d1-477b-9fe7-0a9606a69df8\") " Apr 24 22:05:48.820309 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.820274 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e836c2d-46d1-477b-9fe7-0a9606a69df8-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "8e836c2d-46d1-477b-9fe7-0a9606a69df8" (UID: "8e836c2d-46d1-477b-9fe7-0a9606a69df8"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:05:48.822907 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.822879 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kube-api-access-lvzt2" (OuterVolumeSpecName: "kube-api-access-lvzt2") pod "8e836c2d-46d1-477b-9fe7-0a9606a69df8" (UID: "8e836c2d-46d1-477b-9fe7-0a9606a69df8"). InnerVolumeSpecName "kube-api-access-lvzt2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:05:48.823628 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.823601 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e836c2d-46d1-477b-9fe7-0a9606a69df8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8e836c2d-46d1-477b-9fe7-0a9606a69df8" (UID: "8e836c2d-46d1-477b-9fe7-0a9606a69df8"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:05:48.831965 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.831933 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8e836c2d-46d1-477b-9fe7-0a9606a69df8" (UID: "8e836c2d-46d1-477b-9fe7-0a9606a69df8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:48.921115 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.921069 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e836c2d-46d1-477b-9fe7-0a9606a69df8-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:05:48.921115 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.921107 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvzt2\" (UniqueName: \"kubernetes.io/projected/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kube-api-access-lvzt2\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:05:48.921350 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.921125 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e836c2d-46d1-477b-9fe7-0a9606a69df8-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:05:48.921350 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:48.921140 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8e836c2d-46d1-477b-9fe7-0a9606a69df8-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:05:49.458454 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:49.458414 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" event={"ID":"8e836c2d-46d1-477b-9fe7-0a9606a69df8","Type":"ContainerDied","Data":"994ef6514bf3477acaf50ad6e52bc8c677b3ff2189d6a118ce817bd654bcd744"} Apr 24 22:05:49.458916 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:49.458474 2581 scope.go:117] "RemoveContainer" containerID="b02c9353c56d647411b1a9b95d1bf7e1b6a96868c82f75e37ad667a286260f4a" Apr 24 22:05:49.458916 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:49.458486 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d" Apr 24 22:05:49.469920 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:49.469902 2581 scope.go:117] "RemoveContainer" containerID="0a249ead5f044786fdb9e8c7ef621fd6508f7600da9a8cdb8b4dc9abb59dd334" Apr 24 22:05:49.477934 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:49.477583 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d"] Apr 24 22:05:49.481500 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:49.481342 2581 scope.go:117] "RemoveContainer" containerID="92753dbc20c04ac5c691b33b4e4e98258f17b6cd66d634320df46f679c4de7f2" Apr 24 22:05:49.483946 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:49.483314 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-mgt2d"] Apr 24 22:05:50.951674 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:05:50.951639 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" path="/var/lib/kubelet/pods/8e836c2d-46d1-477b-9fe7-0a9606a69df8/volumes" Apr 24 22:07:16.754070 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:16.754032 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" event={"ID":"8d3295c9-fd45-4290-916a-929197c3d174","Type":"ContainerStarted","Data":"7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57"} Apr 24 22:07:16.754571 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:16.754078 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" event={"ID":"8d3295c9-fd45-4290-916a-929197c3d174","Type":"ContainerStarted","Data":"7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e"} Apr 24 22:07:16.754571 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:16.754173 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:07:16.780184 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:16.775505 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" podStartSLOduration=6.389026041 podStartE2EDuration="1m58.775477753s" podCreationTimestamp="2026-04-24 22:05:18 +0000 UTC" firstStartedPulling="2026-04-24 22:05:23.337274086 +0000 UTC m=+2976.946413042" lastFinishedPulling="2026-04-24 22:07:15.723725796 +0000 UTC m=+3089.332864754" observedRunningTime="2026-04-24 22:07:16.773701923 +0000 UTC m=+3090.382840902" watchObservedRunningTime="2026-04-24 22:07:16.775477753 +0000 UTC m=+3090.384616733" Apr 24 22:07:17.757458 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:17.757422 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:07:17.758682 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:17.758650 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 22:07:18.759997 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:18.759958 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 22:07:23.764196 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:23.764161 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:07:23.764978 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:23.764958 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:07:29.834732 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.834699 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v"] Apr 24 22:07:29.835254 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.835063 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kserve-container" containerID="cri-o://7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e" gracePeriod=30 Apr 24 22:07:29.835254 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.835105 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kube-rbac-proxy" containerID="cri-o://7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57" gracePeriod=30 Apr 24 22:07:29.930836 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.930790 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m"] Apr 24 22:07:29.931086 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.931073 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="storage-initializer" Apr 24 22:07:29.931141 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.931088 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="storage-initializer" Apr 24 22:07:29.931141 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.931103 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kserve-container" Apr 24 22:07:29.931141 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.931109 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kserve-container" Apr 24 22:07:29.931141 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.931115 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" Apr 24 22:07:29.931141 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.931121 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" Apr 24 22:07:29.931313 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.931175 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kube-rbac-proxy" Apr 24 22:07:29.931313 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.931183 2581 
memory_manager.go:356] "RemoveStaleState removing state" podUID="8e836c2d-46d1-477b-9fe7-0a9606a69df8" containerName="kserve-container" Apr 24 22:07:29.962188 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.962163 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m"] Apr 24 22:07:29.962335 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.962319 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:29.966135 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.966115 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 24 22:07:29.966290 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:29.966122 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 24 22:07:30.057857 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.057827 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.058043 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.057970 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92r6g\" (UniqueName: \"kubernetes.io/projected/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kube-api-access-92r6g\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.058119 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.058062 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69d7ea50-6dd5-4c39-977a-64d633bec4fd-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.058119 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.058108 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69d7ea50-6dd5-4c39-977a-64d633bec4fd-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.159154 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.159118 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69d7ea50-6dd5-4c39-977a-64d633bec4fd-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.159345 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.159165 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/69d7ea50-6dd5-4c39-977a-64d633bec4fd-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.159345 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.159208 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.159345 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.159240 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92r6g\" (UniqueName: \"kubernetes.io/projected/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kube-api-access-92r6g\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.159690 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.159657 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.159988 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.159965 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69d7ea50-6dd5-4c39-977a-64d633bec4fd-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.161597 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.161580 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69d7ea50-6dd5-4c39-977a-64d633bec4fd-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.167762 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.167742 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92r6g\" (UniqueName: \"kubernetes.io/projected/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kube-api-access-92r6g\") pod \"isvc-xgboost-predictor-8689c4cfcc-lsk5m\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.271889 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.271852 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:30.391478 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.391451 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m"] Apr 24 22:07:30.394075 ip-10-0-132-159 kubenswrapper[2581]: W0424 22:07:30.394046 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d7ea50_6dd5_4c39_977a_64d633bec4fd.slice/crio-241630de09b33d231965ca7c2f801ca713292574ac32be82f4d2bade0890be04 WatchSource:0}: Error finding container 241630de09b33d231965ca7c2f801ca713292574ac32be82f4d2bade0890be04: Status 404 returned error can't find the container with id 241630de09b33d231965ca7c2f801ca713292574ac32be82f4d2bade0890be04 Apr 24 22:07:30.792821 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.792725 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" event={"ID":"69d7ea50-6dd5-4c39-977a-64d633bec4fd","Type":"ContainerStarted","Data":"603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d"} Apr 24 22:07:30.792821 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.792778 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" event={"ID":"69d7ea50-6dd5-4c39-977a-64d633bec4fd","Type":"ContainerStarted","Data":"241630de09b33d231965ca7c2f801ca713292574ac32be82f4d2bade0890be04"} Apr 24 22:07:30.794551 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.794527 2581 generic.go:358] "Generic (PLEG): container finished" podID="8d3295c9-fd45-4290-916a-929197c3d174" containerID="7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57" exitCode=2 Apr 24 22:07:30.794690 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:30.794557 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" event={"ID":"8d3295c9-fd45-4290-916a-929197c3d174","Type":"ContainerDied","Data":"7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57"} Apr 24 22:07:32.573439 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.573412 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:07:32.680218 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.680177 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d3295c9-fd45-4290-916a-929197c3d174-isvc-triton-kube-rbac-proxy-sar-config\") pod \"8d3295c9-fd45-4290-916a-929197c3d174\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " Apr 24 22:07:32.680420 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.680241 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3295c9-fd45-4290-916a-929197c3d174-proxy-tls\") pod \"8d3295c9-fd45-4290-916a-929197c3d174\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " Apr 24 22:07:32.680420 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.680360 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3295c9-fd45-4290-916a-929197c3d174-kserve-provision-location\") pod \"8d3295c9-fd45-4290-916a-929197c3d174\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " Apr 24 22:07:32.680524 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.680473 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5l27\" (UniqueName: \"kubernetes.io/projected/8d3295c9-fd45-4290-916a-929197c3d174-kube-api-access-x5l27\") pod \"8d3295c9-fd45-4290-916a-929197c3d174\" (UID: \"8d3295c9-fd45-4290-916a-929197c3d174\") " Apr 24 22:07:32.680632 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.680609 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d3295c9-fd45-4290-916a-929197c3d174-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "8d3295c9-fd45-4290-916a-929197c3d174" (UID: "8d3295c9-fd45-4290-916a-929197c3d174"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:07:32.680845 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.680822 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d3295c9-fd45-4290-916a-929197c3d174-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d3295c9-fd45-4290-916a-929197c3d174" (UID: "8d3295c9-fd45-4290-916a-929197c3d174"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:07:32.682552 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.682519 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d3295c9-fd45-4290-916a-929197c3d174-kube-api-access-x5l27" (OuterVolumeSpecName: "kube-api-access-x5l27") pod "8d3295c9-fd45-4290-916a-929197c3d174" (UID: "8d3295c9-fd45-4290-916a-929197c3d174"). InnerVolumeSpecName "kube-api-access-x5l27". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:07:32.682651 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.682587 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d3295c9-fd45-4290-916a-929197c3d174-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8d3295c9-fd45-4290-916a-929197c3d174" (UID: "8d3295c9-fd45-4290-916a-929197c3d174"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:07:32.781440 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.781375 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d3295c9-fd45-4290-916a-929197c3d174-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:07:32.781440 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.781436 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x5l27\" (UniqueName: \"kubernetes.io/projected/8d3295c9-fd45-4290-916a-929197c3d174-kube-api-access-x5l27\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:07:32.781650 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.781452 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d3295c9-fd45-4290-916a-929197c3d174-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:07:32.781650 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.781467 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d3295c9-fd45-4290-916a-929197c3d174-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:07:32.804132 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.804099 2581 generic.go:358] "Generic (PLEG): container finished" podID="8d3295c9-fd45-4290-916a-929197c3d174" containerID="7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e" exitCode=0 Apr 24 22:07:32.804279 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.804136 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" event={"ID":"8d3295c9-fd45-4290-916a-929197c3d174","Type":"ContainerDied","Data":"7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e"} Apr 24 22:07:32.804279 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.804161 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" event={"ID":"8d3295c9-fd45-4290-916a-929197c3d174","Type":"ContainerDied","Data":"b11392b25ade96a274932d795c7fe65c1415c897924ad24c4116159e1e418cad"} Apr 24 22:07:32.804279 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.804176 2581 scope.go:117] "RemoveContainer" containerID="7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57" Apr 24 22:07:32.804279 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.804177 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v" Apr 24 22:07:32.811615 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.811589 2581 scope.go:117] "RemoveContainer" containerID="7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e" Apr 24 22:07:32.818538 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.818516 2581 scope.go:117] "RemoveContainer" containerID="c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540" Apr 24 22:07:32.825327 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.825307 2581 scope.go:117] "RemoveContainer" containerID="7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57" Apr 24 22:07:32.825610 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:07:32.825585 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57\": container with ID starting with 7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57 not found: ID does not exist" containerID="7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57" Apr 24 22:07:32.825702 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.825620 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57"} err="failed to get container status \"7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57\": rpc error: code = NotFound desc = could not find container \"7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57\": container with ID starting with 7d5fe9fa0466fc90490cabc5530bee74934949edae1e1ea785b199015d34fe57 not found: ID does not exist" Apr 24 22:07:32.825702 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.825645 2581 scope.go:117] "RemoveContainer" containerID="7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e" Apr 24 22:07:32.825915 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:07:32.825891 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e\": container with ID starting with 7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e not found: ID does not exist" containerID="7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e" Apr 24 22:07:32.825967 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.825921 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e"} err="failed to get container status \"7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e\": rpc error: code = NotFound desc = could not find container \"7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e\": container with ID starting with 7e58e88f78dd22aea1dcd80fca45877e5b5e9d34865b91fbca0ee9ec292d402e not found: ID does not exist" Apr 24 22:07:32.825967 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.825940 2581 scope.go:117] "RemoveContainer" containerID="c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540" Apr 24 22:07:32.826138 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.826122 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v"] Apr 24 22:07:32.826219 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:07:32.826191 
2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540\": container with ID starting with c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540 not found: ID does not exist" containerID="c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540" Apr 24 22:07:32.826259 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.826226 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540"} err="failed to get container status \"c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540\": rpc error: code = NotFound desc = could not find container \"c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540\": container with ID starting with c102b3789697457aa9fd31dfd9a7eb83c8a0d6008090bd5ce0a9c4a66b573540 not found: ID does not exist" Apr 24 22:07:32.830041 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.830022 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-7nd6v"] Apr 24 22:07:32.950121 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:32.950047 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d3295c9-fd45-4290-916a-929197c3d174" path="/var/lib/kubelet/pods/8d3295c9-fd45-4290-916a-929197c3d174/volumes" Apr 24 22:07:34.812652 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:34.812619 2581 generic.go:358] "Generic (PLEG): container finished" podID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerID="603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d" exitCode=0 Apr 24 22:07:34.813030 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:34.812666 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" event={"ID":"69d7ea50-6dd5-4c39-977a-64d633bec4fd","Type":"ContainerDied","Data":"603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d"} Apr 24 22:07:54.879334 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:54.879298 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" event={"ID":"69d7ea50-6dd5-4c39-977a-64d633bec4fd","Type":"ContainerStarted","Data":"944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c"} Apr 24 22:07:54.879334 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:54.879342 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" event={"ID":"69d7ea50-6dd5-4c39-977a-64d633bec4fd","Type":"ContainerStarted","Data":"b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881"} Apr 24 22:07:54.879802 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:54.879653 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:54.879802 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:54.879694 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:07:54.881003 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:54.880975 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 22:07:54.899137 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:54.899079 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podStartSLOduration=6.93529167 podStartE2EDuration="25.899067308s" podCreationTimestamp="2026-04-24 22:07:29 +0000 UTC" firstStartedPulling="2026-04-24 22:07:34.813914212 +0000 UTC m=+3108.423053169" lastFinishedPulling="2026-04-24 22:07:53.777689847 +0000 UTC m=+3127.386828807" observedRunningTime="2026-04-24 22:07:54.897839663 +0000 UTC m=+3128.506978636" watchObservedRunningTime="2026-04-24 22:07:54.899067308 +0000 UTC m=+3128.508206288" Apr 24 22:07:55.882149 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:07:55.882104 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 22:08:00.886722 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:08:00.886692 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:08:00.887297 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:08:00.887272 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 22:08:10.887274 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:08:10.887187 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 22:08:20.887629 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:08:20.887592 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 22:08:30.887549 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:08:30.887504 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 22:08:40.887464 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:08:40.887423 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 22:08:50.887456 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:08:50.887413 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" 
Apr 24 22:09:00.888206 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:00.888175 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:09:10.032322 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:10.032287 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m"] Apr 24 22:09:10.033468 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:10.033407 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" containerID="cri-o://b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881" gracePeriod=30 Apr 24 22:09:10.033765 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:10.033716 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kube-rbac-proxy" containerID="cri-o://944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c" gracePeriod=30 Apr 24 22:09:10.882591 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:10.882551 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 24 22:09:10.887420 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:10.887375 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 22:09:11.100611 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:11.100577 2581 generic.go:358] "Generic (PLEG): container finished" podID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerID="944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c" exitCode=2 Apr 24 22:09:11.100980 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:11.100648 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" event={"ID":"69d7ea50-6dd5-4c39-977a-64d633bec4fd","Type":"ContainerDied","Data":"944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c"} Apr 24 22:09:13.669930 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.669908 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:09:13.760970 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.760930 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92r6g\" (UniqueName: \"kubernetes.io/projected/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kube-api-access-92r6g\") pod \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " Apr 24 22:09:13.761148 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.760978 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kserve-provision-location\") pod \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " Apr 24 22:09:13.761148 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.761110 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69d7ea50-6dd5-4c39-977a-64d633bec4fd-proxy-tls\") pod \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " Apr 24 22:09:13.761261 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.761158 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69d7ea50-6dd5-4c39-977a-64d633bec4fd-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\" (UID: \"69d7ea50-6dd5-4c39-977a-64d633bec4fd\") " Apr 24 22:09:13.761323 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.761287 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "69d7ea50-6dd5-4c39-977a-64d633bec4fd" (UID: "69d7ea50-6dd5-4c39-977a-64d633bec4fd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:13.761547 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.761523 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d7ea50-6dd5-4c39-977a-64d633bec4fd-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "69d7ea50-6dd5-4c39-977a-64d633bec4fd" (UID: "69d7ea50-6dd5-4c39-977a-64d633bec4fd"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:09:13.763059 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.763026 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kube-api-access-92r6g" (OuterVolumeSpecName: "kube-api-access-92r6g") pod "69d7ea50-6dd5-4c39-977a-64d633bec4fd" (UID: "69d7ea50-6dd5-4c39-977a-64d633bec4fd"). InnerVolumeSpecName "kube-api-access-92r6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:09:13.763150 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.763062 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d7ea50-6dd5-4c39-977a-64d633bec4fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "69d7ea50-6dd5-4c39-977a-64d633bec4fd" (UID: "69d7ea50-6dd5-4c39-977a-64d633bec4fd"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:09:13.862069 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.861972 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-92r6g\" (UniqueName: \"kubernetes.io/projected/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kube-api-access-92r6g\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:09:13.862069 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.862022 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69d7ea50-6dd5-4c39-977a-64d633bec4fd-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:09:13.862069 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.862033 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69d7ea50-6dd5-4c39-977a-64d633bec4fd-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:09:13.862069 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:13.862042 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/69d7ea50-6dd5-4c39-977a-64d633bec4fd-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:09:14.111643 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.111611 2581 generic.go:358] "Generic (PLEG): container finished" podID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerID="b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881" exitCode=0 Apr 24 22:09:14.111823 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.111696 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" Apr 24 22:09:14.111823 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.111714 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" event={"ID":"69d7ea50-6dd5-4c39-977a-64d633bec4fd","Type":"ContainerDied","Data":"b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881"} Apr 24 22:09:14.111823 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.111763 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m" event={"ID":"69d7ea50-6dd5-4c39-977a-64d633bec4fd","Type":"ContainerDied","Data":"241630de09b33d231965ca7c2f801ca713292574ac32be82f4d2bade0890be04"} Apr 24 22:09:14.111823 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.111785 2581 scope.go:117] "RemoveContainer" containerID="944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c" Apr 24 22:09:14.120219 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.120203 2581 scope.go:117] "RemoveContainer" containerID="b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881" Apr 24 22:09:14.126869 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.126848 2581 scope.go:117] "RemoveContainer" containerID="603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d" Apr 24 22:09:14.135765 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.135741 2581 scope.go:117] "RemoveContainer" containerID="944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c" Apr 24 22:09:14.135857 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.135836 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m"] Apr 24 22:09:14.136031 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:09:14.136010 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c\": container with ID starting with 944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c not found: ID does not exist" containerID="944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c" Apr 24 22:09:14.136079 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.136039 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c"} err="failed to get container status \"944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c\": rpc error: code = NotFound desc = could not find container \"944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c\": container with ID starting with 944dd33352c24f160423819596370286657f4b0c14b5ede7721703c7c96bbf5c not found: ID does not exist" Apr 24 22:09:14.136079 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.136057 2581 scope.go:117] "RemoveContainer" containerID="b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881" Apr 24 22:09:14.136330 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:09:14.136297 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881\": container with ID starting with b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881 not found: ID does not exist" 
containerID="b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881" Apr 24 22:09:14.136415 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.136340 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881"} err="failed to get container status \"b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881\": rpc error: code = NotFound desc = could not find container \"b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881\": container with ID starting with b7762cf1ec642067266212174541e2ea3684175a383926ad87bd610f3131f881 not found: ID does not exist" Apr 24 22:09:14.136415 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.136361 2581 scope.go:117] "RemoveContainer" containerID="603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d" Apr 24 22:09:14.136627 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:09:14.136612 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d\": container with ID starting with 603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d not found: ID does not exist" containerID="603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d" Apr 24 22:09:14.136688 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.136631 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d"} err="failed to get container status \"603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d\": rpc error: code = NotFound desc = could not find container \"603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d\": container with ID starting with 603a8ee2f4c5fd47472c43538cba600533d0a93ed9572eebc394ff13ce50bd9d not found: ID does not exist" Apr 24 22:09:14.138982 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.138951 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-lsk5m"] Apr 24 22:09:14.950074 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:09:14.950042 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" path="/var/lib/kubelet/pods/69d7ea50-6dd5-4c39-977a-64d633bec4fd/volumes" Apr 24 22:10:47.068388 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:47.068355 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:10:47.073816 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:47.073796 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:10:50.685280 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685247 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt"] Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685565 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kube-rbac-proxy" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685577 2581 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kube-rbac-proxy" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685589 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kube-rbac-proxy" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685595 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kube-rbac-proxy" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685602 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kserve-container" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685608 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kserve-container" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685618 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="storage-initializer" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685624 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="storage-initializer" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685629 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685634 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685639 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="storage-initializer" Apr 24 22:10:50.685671 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685645 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="storage-initializer" Apr 24 22:10:50.686034 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685689 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kube-rbac-proxy" Apr 24 22:10:50.686034 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685698 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kube-rbac-proxy" Apr 24 22:10:50.686034 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685704 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d3295c9-fd45-4290-916a-929197c3d174" containerName="kserve-container" Apr 24 22:10:50.686034 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.685711 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="69d7ea50-6dd5-4c39-977a-64d633bec4fd" containerName="kserve-container" Apr 24 22:10:50.688708 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.688691 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.691256 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.691225 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 24 22:10:50.691382 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.691230 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 22:10:50.691382 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.691326 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 22:10:50.691802 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.691785 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:10:50.692408 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.692370 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 22:10:50.701896 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.701872 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt"] Apr 24 22:10:50.781710 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.781655 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d26ebb1-8df4-4c22-af17-8b91691461b7-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.781710 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.781705 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5q2p\" (UniqueName: \"kubernetes.io/projected/2d26ebb1-8df4-4c22-af17-8b91691461b7-kube-api-access-j5q2p\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.781949 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.781735 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d26ebb1-8df4-4c22-af17-8b91691461b7-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.781949 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.781813 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d26ebb1-8df4-4c22-af17-8b91691461b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.882222 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.882172 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d26ebb1-8df4-4c22-af17-8b91691461b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.882474 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.882247 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d26ebb1-8df4-4c22-af17-8b91691461b7-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.882474 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.882307 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5q2p\" (UniqueName: \"kubernetes.io/projected/2d26ebb1-8df4-4c22-af17-8b91691461b7-kube-api-access-j5q2p\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.882474 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.882351 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d26ebb1-8df4-4c22-af17-8b91691461b7-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.882741 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.882721 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d26ebb1-8df4-4c22-af17-8b91691461b7-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.882886 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.882868 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d26ebb1-8df4-4c22-af17-8b91691461b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.884742 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.884721 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d26ebb1-8df4-4c22-af17-8b91691461b7-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.890794 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.890759 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5q2p\" (UniqueName: \"kubernetes.io/projected/2d26ebb1-8df4-4c22-af17-8b91691461b7-kube-api-access-j5q2p\") pod \"isvc-xgboost-runtime-predictor-779db84d9-4z5rt\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:50.999378 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:50.999288 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:51.118772 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:51.118738 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt"] Apr 24 22:10:51.121777 ip-10-0-132-159 kubenswrapper[2581]: W0424 22:10:51.121749 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d26ebb1_8df4_4c22_af17_8b91691461b7.slice/crio-d5d75cb10c40e44fa0e7bce7c91bce08c81ca34fa07a0c8df93b795860ede9a5 WatchSource:0}: Error finding container d5d75cb10c40e44fa0e7bce7c91bce08c81ca34fa07a0c8df93b795860ede9a5: Status 404 returned error can't find the container with id d5d75cb10c40e44fa0e7bce7c91bce08c81ca34fa07a0c8df93b795860ede9a5 Apr 24 22:10:51.123506 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:51.123488 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:10:51.385205 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:51.385175 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" event={"ID":"2d26ebb1-8df4-4c22-af17-8b91691461b7","Type":"ContainerStarted","Data":"7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc"} Apr 24 22:10:51.385367 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:51.385210 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" event={"ID":"2d26ebb1-8df4-4c22-af17-8b91691461b7","Type":"ContainerStarted","Data":"d5d75cb10c40e44fa0e7bce7c91bce08c81ca34fa07a0c8df93b795860ede9a5"} Apr 24 22:10:55.403316 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:55.403283 2581 generic.go:358] "Generic (PLEG): container finished" podID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerID="7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc" exitCode=0 Apr 24 22:10:55.403965 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:55.403902 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" event={"ID":"2d26ebb1-8df4-4c22-af17-8b91691461b7","Type":"ContainerDied","Data":"7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc"} Apr 24 22:10:56.408133 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:56.408098 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" event={"ID":"2d26ebb1-8df4-4c22-af17-8b91691461b7","Type":"ContainerStarted","Data":"5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6"} Apr 24 22:10:56.408133 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:56.408137 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" event={"ID":"2d26ebb1-8df4-4c22-af17-8b91691461b7","Type":"ContainerStarted","Data":"0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed"} Apr 24 22:10:56.408560 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:56.408333 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:56.432737 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:56.432690 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podStartSLOduration=6.432674838 podStartE2EDuration="6.432674838s" podCreationTimestamp="2026-04-24 22:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:10:56.431428411 +0000 UTC m=+3310.040567394" watchObservedRunningTime="2026-04-24 22:10:56.432674838 +0000 UTC m=+3310.041813818" Apr 24 22:10:57.411374 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:57.411335 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:10:57.412484 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:57.412453 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 22:10:58.414062 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:10:58.414024 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 22:11:03.418760 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:11:03.418687 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:11:03.419306 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:11:03.419279 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 22:11:13.420121 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:11:13.420083 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 22:11:23.419682 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:11:23.419646 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 22:11:33.420228 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:11:33.420187 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 22:11:43.419618 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:11:43.419578 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 22:11:53.419738 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:11:53.419690 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 22:12:03.420216 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:03.420188 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:12:10.580556 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:10.580517 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt"] Apr 24 22:12:10.581129 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:10.580846 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" containerID="cri-o://0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed" gracePeriod=30 Apr 24 22:12:10.581129 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:10.580882 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kube-rbac-proxy" containerID="cri-o://5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6" gracePeriod=30 Apr 24 22:12:11.618343 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:11.618309 2581 generic.go:358] "Generic (PLEG): container finished" podID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerID="5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6" exitCode=2 Apr 24 22:12:11.618343 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:11.618349 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" event={"ID":"2d26ebb1-8df4-4c22-af17-8b91691461b7","Type":"ContainerDied","Data":"5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6"} Apr 24 22:12:13.415175 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:13.415127 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 24 22:12:13.419452 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:13.419417 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 22:12:14.314745 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.314719 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:12:14.416870 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.416835 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d26ebb1-8df4-4c22-af17-8b91691461b7-proxy-tls\") pod \"2d26ebb1-8df4-4c22-af17-8b91691461b7\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " Apr 24 22:12:14.417278 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.416889 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d26ebb1-8df4-4c22-af17-8b91691461b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"2d26ebb1-8df4-4c22-af17-8b91691461b7\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " Apr 24 22:12:14.417278 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.416941 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d26ebb1-8df4-4c22-af17-8b91691461b7-kserve-provision-location\") pod \"2d26ebb1-8df4-4c22-af17-8b91691461b7\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " Apr 24 22:12:14.417278 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.416963 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5q2p\" (UniqueName: \"kubernetes.io/projected/2d26ebb1-8df4-4c22-af17-8b91691461b7-kube-api-access-j5q2p\") pod \"2d26ebb1-8df4-4c22-af17-8b91691461b7\" (UID: \"2d26ebb1-8df4-4c22-af17-8b91691461b7\") " Apr 24 22:12:14.417278 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.417196 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d26ebb1-8df4-4c22-af17-8b91691461b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2d26ebb1-8df4-4c22-af17-8b91691461b7" (UID: "2d26ebb1-8df4-4c22-af17-8b91691461b7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:12:14.417278 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.417240 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d26ebb1-8df4-4c22-af17-8b91691461b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "2d26ebb1-8df4-4c22-af17-8b91691461b7" (UID: "2d26ebb1-8df4-4c22-af17-8b91691461b7"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:12:14.418843 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.418822 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d26ebb1-8df4-4c22-af17-8b91691461b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2d26ebb1-8df4-4c22-af17-8b91691461b7" (UID: "2d26ebb1-8df4-4c22-af17-8b91691461b7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:12:14.418935 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.418909 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d26ebb1-8df4-4c22-af17-8b91691461b7-kube-api-access-j5q2p" (OuterVolumeSpecName: "kube-api-access-j5q2p") pod "2d26ebb1-8df4-4c22-af17-8b91691461b7" (UID: "2d26ebb1-8df4-4c22-af17-8b91691461b7"). 
InnerVolumeSpecName "kube-api-access-j5q2p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:12:14.518475 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.518426 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d26ebb1-8df4-4c22-af17-8b91691461b7-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:12:14.518475 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.518468 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j5q2p\" (UniqueName: \"kubernetes.io/projected/2d26ebb1-8df4-4c22-af17-8b91691461b7-kube-api-access-j5q2p\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:12:14.518475 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.518479 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d26ebb1-8df4-4c22-af17-8b91691461b7-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:12:14.518475 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.518490 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d26ebb1-8df4-4c22-af17-8b91691461b7-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:12:14.627142 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.627109 2581 generic.go:358] "Generic (PLEG): container finished" podID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerID="0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed" exitCode=0 Apr 24 22:12:14.627294 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.627200 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" Apr 24 22:12:14.627294 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.627199 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" event={"ID":"2d26ebb1-8df4-4c22-af17-8b91691461b7","Type":"ContainerDied","Data":"0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed"} Apr 24 22:12:14.627294 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.627244 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt" event={"ID":"2d26ebb1-8df4-4c22-af17-8b91691461b7","Type":"ContainerDied","Data":"d5d75cb10c40e44fa0e7bce7c91bce08c81ca34fa07a0c8df93b795860ede9a5"} Apr 24 22:12:14.627294 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.627266 2581 scope.go:117] "RemoveContainer" containerID="5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6" Apr 24 22:12:14.640575 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.640549 2581 scope.go:117] "RemoveContainer" containerID="0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed" Apr 24 22:12:14.647331 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.647311 2581 scope.go:117] "RemoveContainer" containerID="7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc" Apr 24 22:12:14.653970 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.653947 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt"] Apr 24 22:12:14.654246 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.654231 2581 scope.go:117] "RemoveContainer" containerID="5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6" Apr 24 22:12:14.654598 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:12:14.654575 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6\": container with ID starting with 5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6 not found: ID does not exist" containerID="5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6" Apr 24 22:12:14.654674 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.654607 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6"} err="failed to get container status \"5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6\": rpc error: code = NotFound desc = could not find container \"5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6\": container with ID starting with 5fded3742a0369445bf2503fb270abb2b53b8c47c657bf3587cc5ae37bed76c6 not found: ID does not exist" Apr 24 22:12:14.654674 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.654625 2581 scope.go:117] "RemoveContainer" containerID="0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed" Apr 24 22:12:14.654862 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:12:14.654844 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed\": container with ID starting with 0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed not found: ID does not exist" 
containerID="0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed" Apr 24 22:12:14.654905 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.654867 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed"} err="failed to get container status \"0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed\": rpc error: code = NotFound desc = could not find container \"0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed\": container with ID starting with 0a3063d602680e4fa57eefdf118840765ef17851e9d2e28d71f21da79f4a74ed not found: ID does not exist" Apr 24 22:12:14.654905 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.654882 2581 scope.go:117] "RemoveContainer" containerID="7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc" Apr 24 22:12:14.655120 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:12:14.655102 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc\": container with ID starting with 7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc not found: ID does not exist" containerID="7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc" Apr 24 22:12:14.655177 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.655124 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc"} err="failed to get container status \"7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc\": rpc error: code = NotFound desc = could not find container \"7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc\": container with ID starting with 7b7511a5d2c4d36c862fe69074284bae685f0d11a634fecd172ad91f5f40c0bc not found: ID does not exist" Apr 24 22:12:14.657650 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.657630 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-4z5rt"] Apr 24 22:12:14.949627 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:12:14.949594 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" path="/var/lib/kubelet/pods/2d26ebb1-8df4-4c22-af17-8b91691461b7/volumes" Apr 24 22:13:00.826640 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.826608 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw"] Apr 24 22:13:00.827109 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.826892 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" Apr 24 22:13:00.827109 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.826904 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" Apr 24 22:13:00.827109 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.826913 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="storage-initializer" Apr 24 22:13:00.827109 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.826918 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" 
containerName="storage-initializer" Apr 24 22:13:00.827109 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.826937 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kube-rbac-proxy" Apr 24 22:13:00.827109 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.826942 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kube-rbac-proxy" Apr 24 22:13:00.827109 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.826980 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kserve-container" Apr 24 22:13:00.827109 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.826990 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d26ebb1-8df4-4c22-af17-8b91691461b7" containerName="kube-rbac-proxy" Apr 24 22:13:00.829931 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.829915 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.832185 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.832160 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 24 22:13:00.833594 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.833569 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 22:13:00.833594 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.833581 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:13:00.833758 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.833607 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 22:13:00.833758 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.833607 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8q48m\"" Apr 24 22:13:00.840187 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.840165 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw"] Apr 24 22:13:00.878081 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.878054 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6caf0557-ee59-412e-8376-709591b94d3a-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.878183 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.878087 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6caf0557-ee59-412e-8376-709591b94d3a-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.878183 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.878116 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6caf0557-ee59-412e-8376-709591b94d3a-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.878252 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.878192 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpc44\" (UniqueName: \"kubernetes.io/projected/6caf0557-ee59-412e-8376-709591b94d3a-kube-api-access-mpc44\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.978721 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.978687 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpc44\" (UniqueName: \"kubernetes.io/projected/6caf0557-ee59-412e-8376-709591b94d3a-kube-api-access-mpc44\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.978875 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.978738 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6caf0557-ee59-412e-8376-709591b94d3a-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.978949 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.978861 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6caf0557-ee59-412e-8376-709591b94d3a-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.978949 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.978918 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6caf0557-ee59-412e-8376-709591b94d3a-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.979043 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:13:00.979005 2581 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-predictor-serving-cert: secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 24 22:13:00.979090 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:13:00.979074 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6caf0557-ee59-412e-8376-709591b94d3a-proxy-tls podName:6caf0557-ee59-412e-8376-709591b94d3a nodeName:}" failed. No retries permitted until 2026-04-24 22:13:01.479053943 +0000 UTC m=+3435.088192918 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6caf0557-ee59-412e-8376-709591b94d3a-proxy-tls") pod "isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" (UID: "6caf0557-ee59-412e-8376-709591b94d3a") : secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 24 22:13:00.979133 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.979090 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6caf0557-ee59-412e-8376-709591b94d3a-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.979442 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.979426 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6caf0557-ee59-412e-8376-709591b94d3a-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:00.988763 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:00.988736 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpc44\" (UniqueName: \"kubernetes.io/projected/6caf0557-ee59-412e-8376-709591b94d3a-kube-api-access-mpc44\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:01.482016 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:01.481981 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6caf0557-ee59-412e-8376-709591b94d3a-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:01.484364 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:01.484344 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6caf0557-ee59-412e-8376-709591b94d3a-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:01.740516 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:01.740417 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:01.864275 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:01.864242 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw"] Apr 24 22:13:01.867614 ip-10-0-132-159 kubenswrapper[2581]: W0424 22:13:01.867589 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6caf0557_ee59_412e_8376_709591b94d3a.slice/crio-df2ec2665a0593360def42bb9a83c54a615998a501f05450ff809e1cca177037 WatchSource:0}: Error finding container df2ec2665a0593360def42bb9a83c54a615998a501f05450ff809e1cca177037: Status 404 returned error can't find the container with id df2ec2665a0593360def42bb9a83c54a615998a501f05450ff809e1cca177037 Apr 24 22:13:02.756667 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:02.756631 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" event={"ID":"6caf0557-ee59-412e-8376-709591b94d3a","Type":"ContainerStarted","Data":"f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2"} Apr 24 22:13:02.756667 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:02.756672 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" event={"ID":"6caf0557-ee59-412e-8376-709591b94d3a","Type":"ContainerStarted","Data":"df2ec2665a0593360def42bb9a83c54a615998a501f05450ff809e1cca177037"} Apr 24 22:13:06.769470 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:06.769435 2581 generic.go:358] "Generic (PLEG): container finished" podID="6caf0557-ee59-412e-8376-709591b94d3a" containerID="f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2" exitCode=0 Apr 24 22:13:06.769954 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:06.769519 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" event={"ID":"6caf0557-ee59-412e-8376-709591b94d3a","Type":"ContainerDied","Data":"f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2"} Apr 24 22:13:07.776775 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:07.776738 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" event={"ID":"6caf0557-ee59-412e-8376-709591b94d3a","Type":"ContainerStarted","Data":"2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3"} Apr 24 22:13:07.776775 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:07.776780 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" event={"ID":"6caf0557-ee59-412e-8376-709591b94d3a","Type":"ContainerStarted","Data":"5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca"} Apr 24 22:13:07.777196 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:07.777059 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:07.777196 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:07.777167 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:07.778430 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:07.778406 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 22:13:07.795312 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:07.795267 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podStartSLOduration=7.795254387 podStartE2EDuration="7.795254387s" podCreationTimestamp="2026-04-24 22:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:13:07.794694189 +0000 UTC m=+3441.403833167" watchObservedRunningTime="2026-04-24 22:13:07.795254387 +0000 UTC m=+3441.404393366" Apr 24 22:13:08.779712 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:08.779660 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 22:13:13.783872 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:13.783793 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:13:13.784382 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:13.784353 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 22:13:23.784352 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:23.784309 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 22:13:33.784289 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:33.784249 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 22:13:43.784786 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:43.784743 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 22:13:53.785116 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:13:53.785068 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 22:14:03.784963 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:03.784874 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 22:14:13.784551 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:13.784516 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:14:20.927976 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:20.927936 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw"] Apr 24 22:14:20.928384 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:20.928250 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" containerID="cri-o://5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca" gracePeriod=30 Apr 24 22:14:20.928384 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:20.928300 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kube-rbac-proxy" containerID="cri-o://2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3" gracePeriod=30 Apr 24 22:14:21.983280 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:21.983246 2581 generic.go:358] "Generic (PLEG): container finished" podID="6caf0557-ee59-412e-8376-709591b94d3a" containerID="2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3" exitCode=2 Apr 24 22:14:21.983675 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:21.983297 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" event={"ID":"6caf0557-ee59-412e-8376-709591b94d3a","Type":"ContainerDied","Data":"2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3"} Apr 24 22:14:23.780050 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:23.780000 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 24 22:14:23.784323 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:23.784295 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 22:14:24.567318 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.567297 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:14:24.618359 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.618330 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpc44\" (UniqueName: \"kubernetes.io/projected/6caf0557-ee59-412e-8376-709591b94d3a-kube-api-access-mpc44\") pod \"6caf0557-ee59-412e-8376-709591b94d3a\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " Apr 24 22:14:24.618542 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.618368 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6caf0557-ee59-412e-8376-709591b94d3a-kserve-provision-location\") pod \"6caf0557-ee59-412e-8376-709591b94d3a\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " Apr 24 22:14:24.618542 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.618417 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6caf0557-ee59-412e-8376-709591b94d3a-proxy-tls\") pod \"6caf0557-ee59-412e-8376-709591b94d3a\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " Apr 24 22:14:24.618542 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.618457 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6caf0557-ee59-412e-8376-709591b94d3a-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"6caf0557-ee59-412e-8376-709591b94d3a\" (UID: \"6caf0557-ee59-412e-8376-709591b94d3a\") " Apr 24 22:14:24.618811 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.618718 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6caf0557-ee59-412e-8376-709591b94d3a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6caf0557-ee59-412e-8376-709591b94d3a" (UID: "6caf0557-ee59-412e-8376-709591b94d3a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:14:24.618921 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.618837 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6caf0557-ee59-412e-8376-709591b94d3a-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "6caf0557-ee59-412e-8376-709591b94d3a" (UID: "6caf0557-ee59-412e-8376-709591b94d3a"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:14:24.620590 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.620560 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6caf0557-ee59-412e-8376-709591b94d3a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6caf0557-ee59-412e-8376-709591b94d3a" (UID: "6caf0557-ee59-412e-8376-709591b94d3a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:14:24.620590 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.620583 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6caf0557-ee59-412e-8376-709591b94d3a-kube-api-access-mpc44" (OuterVolumeSpecName: "kube-api-access-mpc44") pod "6caf0557-ee59-412e-8376-709591b94d3a" (UID: "6caf0557-ee59-412e-8376-709591b94d3a"). 
InnerVolumeSpecName "kube-api-access-mpc44". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:14:24.719775 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.719672 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mpc44\" (UniqueName: \"kubernetes.io/projected/6caf0557-ee59-412e-8376-709591b94d3a-kube-api-access-mpc44\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:14:24.719775 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.719709 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6caf0557-ee59-412e-8376-709591b94d3a-kserve-provision-location\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:14:24.719775 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.719724 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6caf0557-ee59-412e-8376-709591b94d3a-proxy-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:14:24.719775 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.719739 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6caf0557-ee59-412e-8376-709591b94d3a-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\"" Apr 24 22:14:24.992079 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.991995 2581 generic.go:358] "Generic (PLEG): container finished" podID="6caf0557-ee59-412e-8376-709591b94d3a" containerID="5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca" exitCode=0 Apr 24 22:14:24.992079 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.992073 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" Apr 24 22:14:24.992578 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.992074 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" event={"ID":"6caf0557-ee59-412e-8376-709591b94d3a","Type":"ContainerDied","Data":"5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca"} Apr 24 22:14:24.992578 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.992111 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw" event={"ID":"6caf0557-ee59-412e-8376-709591b94d3a","Type":"ContainerDied","Data":"df2ec2665a0593360def42bb9a83c54a615998a501f05450ff809e1cca177037"} Apr 24 22:14:24.992578 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.992128 2581 scope.go:117] "RemoveContainer" containerID="2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3" Apr 24 22:14:24.999876 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:24.999858 2581 scope.go:117] "RemoveContainer" containerID="5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca" Apr 24 22:14:25.006417 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:25.006386 2581 scope.go:117] "RemoveContainer" containerID="f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2" Apr 24 22:14:25.009749 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:25.009727 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw"] Apr 24 22:14:25.012361 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:25.012339 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-4cjnw"] Apr 24 22:14:25.013898 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:25.013878 2581 scope.go:117] "RemoveContainer" containerID="2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3" Apr 24 22:14:25.014143 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:14:25.014126 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3\": container with ID starting with 2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3 not found: ID does not exist" containerID="2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3" Apr 24 22:14:25.014187 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:25.014154 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3"} err="failed to get container status \"2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3\": rpc error: code = NotFound desc = could not find container \"2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3\": container with ID starting with 2cc396416452679551c954770f86f01c92e3af2b5eb955286a401aff3d9d90d3 not found: ID does not exist" Apr 24 22:14:25.014187 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:25.014172 2581 scope.go:117] "RemoveContainer" containerID="5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca" Apr 24 22:14:25.014416 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:14:25.014377 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca\": container with ID starting with 5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca not found: ID does not exist" containerID="5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca" Apr 24 22:14:25.014416 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:25.014407 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca"} err="failed to get container status \"5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca\": rpc error: code = NotFound desc = could not find container \"5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca\": container with ID starting with 5ee85ec88c5c3540c407289a103e4f4b544d37b303b70b2eea21eaab6ce10cca not found: ID does not exist" Apr 24 22:14:25.014550 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:25.014420 2581 scope.go:117] "RemoveContainer" containerID="f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2" Apr 24 22:14:25.014615 ip-10-0-132-159 kubenswrapper[2581]: E0424 22:14:25.014599 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2\": container with ID starting with f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2 not found: ID does not exist" containerID="f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2" Apr 24 22:14:25.014655 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:25.014617 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2"} err="failed to get container status \"f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2\": rpc error: code = NotFound desc = could not find container \"f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2\": container with ID starting with f04a7d7077c2fe9d299c915b5677d205856d652ae5f118df769fb25e71ceb3b2 not found: ID does not exist" Apr 24 22:14:26.950183 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:14:26.950142 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6caf0557-ee59-412e-8376-709591b94d3a" path="/var/lib/kubelet/pods/6caf0557-ee59-412e-8376-709591b94d3a/volumes" Apr 24 22:15:47.086472 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:15:47.086446 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:15:47.092241 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:15:47.092221 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:20:29.225462 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:29.225427 2581 ???:1] "http: TLS handshake error from 10.0.134.248:54912: EOF" Apr 24 22:20:29.231164 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:29.231139 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-svh9x_64430b8d-991a-4176-a2b8-b3c3e21f20ba/global-pull-secret-syncer/0.log" Apr 24 22:20:29.388484 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:29.388460 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-wx7v9_2aa70b5d-a245-4f50-adbe-ce8e71716842/konnectivity-agent/0.log" Apr 24 22:20:29.415009 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:29.414986 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-159.ec2.internal_baec555e5e2b442b2cad3d99698ce3db/haproxy/0.log" Apr 24 22:20:32.560025 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:32.559990 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sn52d_c603a9c8-437b-42f3-960b-865acebe96ec/node-exporter/0.log" Apr 24 22:20:32.585210 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:32.585182 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sn52d_c603a9c8-437b-42f3-960b-865acebe96ec/kube-rbac-proxy/0.log" Apr 24 22:20:32.609869 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:32.609846 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sn52d_c603a9c8-437b-42f3-960b-865acebe96ec/init-textfile/0.log" Apr 24 22:20:36.362277 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.362239 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg"] Apr 24 22:20:36.362675 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.362517 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kube-rbac-proxy" Apr 24 22:20:36.362675 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.362528 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kube-rbac-proxy" Apr 24 22:20:36.362675 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.362545 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" Apr 24 22:20:36.362675 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.362550 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" Apr 24 22:20:36.362675 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.362559 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="storage-initializer" Apr 24 22:20:36.362675 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.362565 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="storage-initializer" Apr 24 22:20:36.362675 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.362605 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kserve-container" Apr 24 22:20:36.362675 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.362615 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6caf0557-ee59-412e-8376-709591b94d3a" containerName="kube-rbac-proxy" Apr 24 22:20:36.365500 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.365480 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.367645 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.367624 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2zhfh\"/\"kube-root-ca.crt\"" Apr 24 22:20:36.367753 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.367624 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2zhfh\"/\"default-dockercfg-fxrng\"" Apr 24 22:20:36.368313 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.368299 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2zhfh\"/\"openshift-service-ca.crt\"" Apr 24 22:20:36.375975 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.375954 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg"] Apr 24 22:20:36.503124 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.503095 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-proc\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.503124 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.503126 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-lib-modules\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.503356 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.503153 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxrkr\" (UniqueName: \"kubernetes.io/projected/0d3e8a02-695d-432e-93cc-d5eec4dff81e-kube-api-access-rxrkr\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.503356 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.503242 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-sys\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.503356 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.503264 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-podres\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.604603 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.604563 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-sys\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " 
pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.604603 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.604604 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-podres\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.604851 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.604634 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-proc\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.604851 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.604650 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-lib-modules\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.604851 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.604675 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxrkr\" (UniqueName: \"kubernetes.io/projected/0d3e8a02-695d-432e-93cc-d5eec4dff81e-kube-api-access-rxrkr\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.604851 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.604696 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-sys\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.604851 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.604706 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-proc\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.604851 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.604766 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-podres\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.604851 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.604809 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d3e8a02-695d-432e-93cc-d5eec4dff81e-lib-modules\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.611433 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.611381 2581 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rxrkr\" (UniqueName: \"kubernetes.io/projected/0d3e8a02-695d-432e-93cc-d5eec4dff81e-kube-api-access-rxrkr\") pod \"perf-node-gather-daemonset-s9slg\" (UID: \"0d3e8a02-695d-432e-93cc-d5eec4dff81e\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.623517 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.623444 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p55cz_593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c/dns/0.log" Apr 24 22:20:36.642606 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.642579 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p55cz_593945d9-3e3d-4c47-ad8e-c3ce4c6dd77c/kube-rbac-proxy/0.log" Apr 24 22:20:36.675727 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.675699 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:36.684091 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.684067 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tc6g6_a8617564-8813-486b-aaeb-9fd4ef61ca2f/dns-node-resolver/0.log" Apr 24 22:20:36.791265 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.791234 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg"] Apr 24 22:20:36.794351 ip-10-0-132-159 kubenswrapper[2581]: W0424 22:20:36.794327 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0d3e8a02_695d_432e_93cc_d5eec4dff81e.slice/crio-ebe42926746553037292d75b2ccc47d7881259463530955ce951d89008ee386e WatchSource:0}: Error finding container ebe42926746553037292d75b2ccc47d7881259463530955ce951d89008ee386e: Status 404 returned error can't find the container with id ebe42926746553037292d75b2ccc47d7881259463530955ce951d89008ee386e Apr 24 22:20:36.795919 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.795900 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:20:36.993317 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.993283 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" event={"ID":"0d3e8a02-695d-432e-93cc-d5eec4dff81e","Type":"ContainerStarted","Data":"f22ba06b4407bc463a65c1d4fe1bd9dca8ef316d89cc080e4d48c27b3b2647c4"} Apr 24 22:20:36.993317 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.993318 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" event={"ID":"0d3e8a02-695d-432e-93cc-d5eec4dff81e","Type":"ContainerStarted","Data":"ebe42926746553037292d75b2ccc47d7881259463530955ce951d89008ee386e"} Apr 24 22:20:36.993545 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:36.993407 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:37.008274 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:37.008229 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" podStartSLOduration=1.008215605 podStartE2EDuration="1.008215605s" podCreationTimestamp="2026-04-24 22:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 22:20:37.006436036 +0000 UTC m=+3890.615575037" watchObservedRunningTime="2026-04-24 22:20:37.008215605 +0000 UTC m=+3890.617354584" Apr 24 22:20:37.136225 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:37.136184 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-556844449-86knp_8e37e936-94ab-4cbe-bc5b-157388985568/registry/0.log" Apr 24 22:20:37.164612 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:37.164568 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-flzqj_8f5835d0-33c0-4340-bfe0-67872e19c79e/node-ca/0.log" Apr 24 22:20:38.210864 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:38.210829 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-blv55_83a219f3-ecef-475c-85b4-5e5f89df5b6f/serve-healthcheck-canary/0.log" Apr 24 22:20:38.725835 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:38.725795 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n68s2_2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b/kube-rbac-proxy/0.log" Apr 24 22:20:38.747888 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:38.747865 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n68s2_2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b/exporter/0.log" Apr 24 22:20:38.768267 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:38.768218 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n68s2_2dbcb72e-3727-4f36-94cb-94ae9e9d5b8b/extractor/0.log" Apr 24 22:20:40.737921 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:40.737882 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-6jpsh_39aea3ba-1dc8-464a-a346-3a8d195c2d2d/server/0.log" Apr 24 22:20:41.039377 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:41.039293 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-tgpzm_b8382f57-de8b-494c-9d57-d6a7172d1123/seaweedfs/0.log" Apr 24 22:20:41.064353 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:41.064320 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-67jrc_2dc277b4-d4f3-4907-a8e7-f21637b0faf4/seaweedfs-tls-custom/0.log" Apr 24 22:20:41.087379 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:41.087349 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-qgdgv_f3d898ad-62e4-4f11-a1a6-a8763745ea7f/seaweedfs-tls-serving/0.log" Apr 24 22:20:43.006033 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:43.006005 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-s9slg" Apr 24 22:20:46.201451 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.201423 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmnsd_2a6bc573-bfae-4ef4-a14b-3d3958d53365/kube-multus-additional-cni-plugins/0.log" Apr 24 22:20:46.226901 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.226875 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmnsd_2a6bc573-bfae-4ef4-a14b-3d3958d53365/egress-router-binary-copy/0.log" Apr 24 22:20:46.243854 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.243829 2581 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmnsd_2a6bc573-bfae-4ef4-a14b-3d3958d53365/cni-plugins/0.log" Apr 24 22:20:46.264001 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.263966 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmnsd_2a6bc573-bfae-4ef4-a14b-3d3958d53365/bond-cni-plugin/0.log" Apr 24 22:20:46.283636 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.283605 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmnsd_2a6bc573-bfae-4ef4-a14b-3d3958d53365/routeoverride-cni/0.log" Apr 24 22:20:46.303532 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.303511 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmnsd_2a6bc573-bfae-4ef4-a14b-3d3958d53365/whereabouts-cni-bincopy/0.log" Apr 24 22:20:46.323847 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.323821 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qmnsd_2a6bc573-bfae-4ef4-a14b-3d3958d53365/whereabouts-cni/0.log" Apr 24 22:20:46.357704 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.357665 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8xkn_85b37092-0856-40bf-ad2e-32b72caa332b/kube-multus/0.log" Apr 24 22:20:46.501368 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.501259 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mf254_ab61ba5a-75b0-4d88-af4b-3e98166b3f50/network-metrics-daemon/0.log" Apr 24 22:20:46.522663 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:46.522634 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mf254_ab61ba5a-75b0-4d88-af4b-3e98166b3f50/kube-rbac-proxy/0.log" Apr 24 22:20:47.104964 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.104934 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:20:47.111409 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.111372 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:20:47.640779 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.640747 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-controller/0.log" Apr 24 22:20:47.658514 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.658478 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/0.log" Apr 24 22:20:47.693609 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.693584 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovn-acl-logging/1.log" Apr 24 22:20:47.717193 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.717157 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/kube-rbac-proxy-node/0.log" Apr 24 22:20:47.740727 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.740697 2581 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 22:20:47.758197 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.758166 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/northd/0.log" Apr 24 22:20:47.776763 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.776737 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/nbdb/0.log" Apr 24 22:20:47.794832 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.794808 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/sbdb/0.log" Apr 24 22:20:47.952221 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:47.952144 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzbzn_4c1d5671-39e8-4826-af5d-f49631e0ece2/ovnkube-controller/0.log" Apr 24 22:20:49.147624 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:49.147593 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-w6vsz_0d188a02-fe28-4c44-96ea-c22a4f133693/network-check-target-container/0.log" Apr 24 22:20:50.093798 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:50.093766 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qrkcn_20f418c6-9af5-4d97-ac8d-065d25b5b429/iptables-alerter/0.log" Apr 24 22:20:50.757626 ip-10-0-132-159 kubenswrapper[2581]: I0424 22:20:50.757595 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-stjmc_cff831a4-3dde-4185-b6ac-264f7592353a/tuned/0.log"