Apr 23 17:38:24.305115 ip-10-0-138-17 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 17:38:24.305129 ip-10-0-138-17 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 17:38:24.305138 ip-10-0-138-17 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 17:38:24.305461 ip-10-0-138-17 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 17:38:34.469179 ip-10-0-138-17 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 17:38:34.469197 ip-10-0-138-17 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 587c3db181934a2aaad3114fad0514c0 --
Apr 23 17:41:03.809081 ip-10-0-138-17 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:41:04.228342 ip-10-0-138-17 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:04.228342 ip-10-0-138-17 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:41:04.228342 ip-10-0-138-17 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:04.228342 ip-10-0-138-17 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:41:04.228342 ip-10-0-138-17 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:04.230625 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.230533 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:41:04.237874 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237853 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:04.237874 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237871 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:04.237874 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237875 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237878 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237882 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237885 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237888 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237891 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237893 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237896 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237899 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237903 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237906 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237909 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237911 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237915 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237918 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237920 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237923 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237925 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237928 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:04.237968 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237931 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237934 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237936 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237940 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237942 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237945 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237948 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237951 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237953 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237956 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237958 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237961 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237964 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237966 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237971 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237975 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237979 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237982 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237984 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237987 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:04.238415 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237990 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237993 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237996 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.237998 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238001 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238003 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238006 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238008 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238011 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238013 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238016 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238018 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238021 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238024 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238027 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238029 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238032 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238035 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238037 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238040 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:04.238901 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238043 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238046 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238048 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238051 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238053 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238056 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238058 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238061 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238063 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238066 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238069 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238071 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238074 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238077 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238079 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238082 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238084 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238087 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238091 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238093 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:04.239418 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238096 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238098 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238101 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238105 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238109 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238561 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238566 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238570 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238573 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238576 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238579 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238581 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238584 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238587 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238590 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238592 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238595 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238597 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238601 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:04.239911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238604 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238607 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238615 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238618 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238621 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238623 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238626 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238628 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238631 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238634 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238637 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238639 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238642 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238645 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238647 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238650 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238652 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238655 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238657 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238661 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:04.240358 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238663 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238666 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238669 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238671 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238675 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238677 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238680 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238684 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238687 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238690 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238693 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238696 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238699 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238701 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238704 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238712 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238715 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238718 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238720 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238723 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:04.240877 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238725 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238728 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238730 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238732 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238735 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238737 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238740 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238742 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238745 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238748 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238750 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238753 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238755 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238758 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238761 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238763 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238783 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238787 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238791 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238795 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:04.241364 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238799 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238804 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238806 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238809 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238812 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238814 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238817 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238820 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238823 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238826 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238828 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.238831 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240274 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240289 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240297 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240302 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240307 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240311 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240315 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240320 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240323 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:41:04.241872 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240326 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240330 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240334 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240338 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240341 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240344 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240347 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240350 2571 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240353 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240356 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240360 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240363 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240366 2571 flags.go:64] FLAG: --config-dir=""
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240370 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240373 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240377 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240380 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240383 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240387 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240390 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240393 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240396 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240400 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240404 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240408 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:41:04.242373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240411 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240415 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240418 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240421 2571 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240424 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240429 2571 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240433 2571 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240436 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240439 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240442 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240446 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240449 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240452 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240455 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240458 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240461 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240464 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240467 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240470 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240473 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240476 2571 flags.go:64] FLAG: --feature-gates=""
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240480 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240483 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240486 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240489 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 17:41:04.242994 ip-10-0-138-17
kubenswrapper[2571]: I0423 17:41:04.240492 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:41:04.242994 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240495 2571 flags.go:64] FLAG: --help="false" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240498 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240502 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240505 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240508 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240512 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240515 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240518 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240521 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240524 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240527 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240530 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240534 2571 flags.go:64] FLAG: 
--kube-api-qps="50" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240536 2571 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240540 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240543 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240546 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240549 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240551 2571 flags.go:64] FLAG: --lock-file="" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240554 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240557 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240560 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240566 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:41:04.243634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240569 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240571 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240574 2571 flags.go:64] FLAG: --logging-format="text" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240578 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 
17:41:04.240581 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240584 2571 flags.go:64] FLAG: --manifest-url="" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240587 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240592 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240595 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240599 2571 flags.go:64] FLAG: --max-pods="110" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240602 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240605 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240608 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240611 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240615 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240617 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240620 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240629 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240632 2571 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240635 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240638 2571 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240640 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240647 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240650 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:41:04.244191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240653 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240656 2571 flags.go:64] FLAG: --port="10250" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240659 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240662 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00f3277c0c6d84b21" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240665 2571 flags.go:64] FLAG: --qos-reserved="" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240668 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240671 2571 flags.go:64] FLAG: --register-node="true" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240673 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240676 2571 flags.go:64] FLAG: --register-with-taints="" Apr 23 
17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240681 2571 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240683 2571 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240686 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240689 2571 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240693 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240696 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240699 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240702 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240705 2571 flags.go:64] FLAG: --runonce="false" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240707 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240710 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240713 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240717 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240720 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240723 2571 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240726 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240729 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:41:04.244782 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240732 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240735 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240738 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240741 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240744 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240747 2571 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240750 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240755 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240758 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240761 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240776 2571 flags.go:64] FLAG: --tls-min-version="" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240779 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 
17:41:04.240782 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240785 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240788 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240791 2571 flags.go:64] FLAG: --v="2" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240795 2571 flags.go:64] FLAG: --version="false" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240802 2571 flags.go:64] FLAG: --vmodule="" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240807 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.240810 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240934 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240939 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240942 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240945 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:41:04.245452 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240947 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240950 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:41:04.246064 ip-10-0-138-17 
kubenswrapper[2571]: W0423 17:41:04.240953 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240956 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240958 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240961 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240963 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240966 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240969 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240971 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240974 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240977 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240979 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240982 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240984 2571 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240990 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240993 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240995 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.240997 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:41:04.246064 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241000 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241003 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241005 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241008 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241010 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241013 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241015 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241017 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:41:04.246565 ip-10-0-138-17 
kubenswrapper[2571]: W0423 17:41:04.241020 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241022 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241025 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241027 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241030 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241032 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241035 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241037 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241039 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241043 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241048 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241051 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:41:04.246565 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241053 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241056 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241058 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241061 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241063 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241066 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241068 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241071 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241075 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241078 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241080 2571 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241082 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241085 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241087 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241090 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241092 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241095 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241097 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241100 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241102 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:41:04.247242 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241105 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241107 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241110 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241112 2571 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241115 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241117 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241121 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241125 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241129 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241132 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241135 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241138 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241140 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241143 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241146 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241149 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 
17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241152 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241155 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241158 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241160 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:04.248096 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241165 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:04.248927 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241168 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:04.248927 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.241170 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:04.248927 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.241761 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:41:04.248927 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.248902 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 17:41:04.248927 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.248923 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249005 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249015 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249023 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249028 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249032 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249037 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249041 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249047 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249051 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249056 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249060 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249064 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249069 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249073 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249077 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249081 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249086 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249090 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:04.249169 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249094 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249099 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249103 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249107 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249111 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249115 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249119 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249123 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249127 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249131 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249135 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249139 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249145 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249159 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249165 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249170 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249175 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249179 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249184 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249188 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:04.250020 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249191 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249195 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249199 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249204 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249208 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249213 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249217 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249221 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249224 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249229 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249232 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249245 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249250 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249254 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249258 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249262 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249266 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249270 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249274 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249278 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:04.250631 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249282 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249285 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249289 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249293 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249297 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249301 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249315 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249320 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249324 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249328 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249332 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249335 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249340 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249344 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249348 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249352 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249356 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249361 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249365 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:04.251346 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249369 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249373 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249377 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249381 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249385 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249389 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249393 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249397 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249401 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.249409 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249637 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249646 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249651 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249655 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249659 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:04.252202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249663 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249668 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249672 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249676 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249680 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249692 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249697 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249701 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249705 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249709 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249713 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249717 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249722 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249726 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249730 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249734 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249738 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249741 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249745 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249749 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:04.252864 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249753 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249757 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249761 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249783 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249788 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249791 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249795 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249799 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249803 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249807 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249812 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249816 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249820 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249824 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249828 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249831 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249835 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249839 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249854 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249859 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:04.253407 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249863 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249867 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249871 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249875 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249879 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249883 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249887 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249891 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249896 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249900 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249904 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249908 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249915 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249921 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249926 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249932 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249936 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249941 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249945 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:04.254024 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249949 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249961 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249965 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249969 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249974 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249978 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249982 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249986 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249990 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249994 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.249999 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250004 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250015 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250019 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250023 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250027 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250031 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250035 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250038 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250043 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:04.254495 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250047 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:04.255037 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:04.250051 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:04.255037 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.250059 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:41:04.255037 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.250796 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 17:41:04.255037 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.253428 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 17:41:04.255037 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.254620 2571 server.go:1019] "Starting client certificate rotation"
Apr 23 17:41:04.255037 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.254721 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:41:04.255536 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.255523 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:41:04.283852 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.283823 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:41:04.286514 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.286491 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:41:04.299638 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.299615 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 23 17:41:04.305124 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.305103 2571 log.go:25] "Validated CRI v1 image API"
Apr 23 17:41:04.307021 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.306997 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 17:41:04.310815 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.310786 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:41:04.311048 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.311030 2571 fs.go:135] Filesystem UUIDs: map[2a4be2db-29ae-44df-9a3b-2d3b82529af9:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c965c7ff-8715-4de8-8105-9f24dbdd8e1e:/dev/nvme0n1p4]
Apr 23 17:41:04.311096 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.311050 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 17:41:04.317130 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.317007 2571 manager.go:217] Machine: {Timestamp:2026-04-23 17:41:04.315860457 +0000 UTC m=+0.394432425 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100098 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23ba3e4cb07ebbc068dad724acb9fd SystemUUID:ec23ba3e-4cb0-7ebb-c068-dad724acb9fd BootID:587c3db1-8193-4a2a-aad3-114fad0514c0 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b7:3e:48:09:0f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b7:3e:48:09:0f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:76:56:f3:41:07 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 17:41:04.317130 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.317121 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 17:41:04.317273 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.317207 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 17:41:04.319587 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.319557 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 17:41:04.319733 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.319588 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-17.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 17:41:04.319807 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.319742 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 17:41:04.319807 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.319751 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 17:41:04.319807 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.319779 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:41:04.319807 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.319797 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:41:04.320999 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.320988 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:41:04.321113 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.321104 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 17:41:04.323297 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.323286 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 17:41:04.323329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.323314 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 17:41:04.323754 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.323745 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 17:41:04.323808 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.323763 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 23 17:41:04.323808 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.323791 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 17:41:04.324880 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.324868 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:41:04.324941 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.324886 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:41:04.332152 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.332130 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 17:41:04.333448 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.333426 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 17:41:04.334998 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.334986 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 17:41:04.335054 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335002 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 17:41:04.335054 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335009 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 17:41:04.335054 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335015 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 17:41:04.335054 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335021 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 17:41:04.335054
ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335027 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 17:41:04.335054 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335033 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 17:41:04.335054 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335039 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 17:41:04.335054 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335046 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 17:41:04.335054 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335053 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 17:41:04.335321 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335061 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 17:41:04.335321 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335070 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 17:41:04.335944 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335934 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 17:41:04.335944 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.335945 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 17:41:04.338286 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.338262 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 17:41:04.338286 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.338265 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-17.ec2.internal\" is forbidden: 
User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:41:04.339470 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.339455 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 17:41:04.339533 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.339496 2571 server.go:1295] "Started kubelet" Apr 23 17:41:04.339583 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.339540 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 17:41:04.340255 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.340207 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 17:41:04.340323 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.340269 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 17:41:04.340331 ip-10-0-138-17 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 17:41:04.341670 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.341654 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 17:41:04.342269 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.342248 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 23 17:41:04.346218 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.346196 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-17.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:41:04.346459 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.346441 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 17:41:04.346551 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.346467 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 17:41:04.347160 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.347142 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 17:41:04.347160 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.347163 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 17:41:04.347321 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.346289 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-17.ec2.internal.18a90d3882e0c920 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-17.ec2.internal,UID:ip-10-0-138-17.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-17.ec2.internal,},FirstTimestamp:2026-04-23 17:41:04.339470624 +0000 UTC m=+0.418042596,LastTimestamp:2026-04-23 17:41:04.339470624 +0000 UTC m=+0.418042596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-17.ec2.internal,}" Apr 23 17:41:04.347321 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.347243 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 17:41:04.347321 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.347294 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:04.347321 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.347315 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 23 17:41:04.347321 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.347324 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 23 17:41:04.347536 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.347387 2571 factory.go:55] Registering systemd factory Apr 23 17:41:04.347536 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.347408 2571 factory.go:223] Registration of the systemd container factory successfully Apr 23 17:41:04.347984 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.347969 2571 factory.go:153] Registering CRI-O factory Apr 23 17:41:04.348069 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.347989 2571 factory.go:223] Registration of the crio container factory successfully Apr 23 17:41:04.348069 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.348055 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 17:41:04.348169 ip-10-0-138-17 
kubenswrapper[2571]: I0423 17:41:04.348080 2571 factory.go:103] Registering Raw factory Apr 23 17:41:04.348169 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.348094 2571 manager.go:1196] Started watching for new ooms in manager Apr 23 17:41:04.348898 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.348879 2571 manager.go:319] Starting recovery of all containers Apr 23 17:41:04.350555 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.350507 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 17:41:04.357363 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.357332 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-17.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 23 17:41:04.357488 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.357464 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 17:41:04.363369 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.363251 2571 manager.go:324] Recovery completed Apr 23 17:41:04.367458 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.367445 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:04.368272 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.368256 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-46bk2" Apr 23 17:41:04.369704 ip-10-0-138-17 
kubenswrapper[2571]: I0423 17:41:04.369689 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:04.369751 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.369720 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:04.369751 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.369730 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:04.370341 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.370326 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 17:41:04.370341 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.370341 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 17:41:04.370421 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.370356 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:41:04.371466 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.371392 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-17.ec2.internal.18a90d3884ae2522 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-17.ec2.internal,UID:ip-10-0-138-17.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-17.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-17.ec2.internal,},FirstTimestamp:2026-04-23 17:41:04.369706274 +0000 UTC m=+0.448278244,LastTimestamp:2026-04-23 17:41:04.369706274 +0000 UTC m=+0.448278244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-17.ec2.internal,}" Apr 23 17:41:04.373387 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.373375 2571 policy_none.go:49] "None policy: Start" Apr 23 17:41:04.373422 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.373392 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 17:41:04.373422 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.373402 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 23 17:41:04.376990 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.376970 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-46bk2" Apr 23 17:41:04.410396 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.410380 2571 manager.go:341] "Starting Device Plugin manager" Apr 23 17:41:04.436542 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.410413 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 17:41:04.436542 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.410423 2571 server.go:85] "Starting device plugin registration server" Apr 23 17:41:04.436542 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.410698 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 17:41:04.436542 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.410709 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 17:41:04.436542 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.410822 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 17:41:04.436542 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.410957 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 17:41:04.436542 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.410967 
2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 17:41:04.436542 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.411482 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 17:41:04.436542 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.411525 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:04.486146 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.486057 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 17:41:04.487484 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.487452 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 17:41:04.487484 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.487481 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 17:41:04.487654 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.487503 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 17:41:04.487654 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.487509 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 17:41:04.487654 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.487550 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 17:41:04.490853 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.490829 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:04.511250 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.511217 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:04.512113 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.512100 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:04.512176 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.512129 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:04.512176 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.512145 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:04.512176 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.512175 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.521705 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.521687 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.521760 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.521711 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-17.ec2.internal\": node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:04.536856 
ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.536838 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:04.588213 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.588182 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal"] Apr 23 17:41:04.588299 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.588260 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:04.589828 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.589813 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:04.589901 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.589845 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:04.589901 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.589860 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:04.592028 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.592015 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:04.592170 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.592157 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.592208 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.592185 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:04.592700 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.592680 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:04.592826 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.592713 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:04.592826 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.592724 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:04.592826 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.592689 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:04.592826 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.592816 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:04.592826 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.592829 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:04.594844 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.594829 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.594912 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.594857 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:04.595468 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.595454 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:04.595535 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.595482 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:04.595535 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.595492 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:04.621593 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.621574 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-17.ec2.internal\" not found" node="ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.626130 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.626113 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-17.ec2.internal\" not found" node="ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.636928 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.636911 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:04.648730 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.648703 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/40e393938119b34b55859b23c603acc7-config\") pod 
\"kube-apiserver-proxy-ip-10-0-138-17.ec2.internal\" (UID: \"40e393938119b34b55859b23c603acc7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.648799 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.648736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4c5ddc418bd1d17b7357c5c1274f08d1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal\" (UID: \"4c5ddc418bd1d17b7357c5c1274f08d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.648799 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.648755 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c5ddc418bd1d17b7357c5c1274f08d1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal\" (UID: \"4c5ddc418bd1d17b7357c5c1274f08d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.737886 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.737799 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:04.749180 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.749155 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4c5ddc418bd1d17b7357c5c1274f08d1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal\" (UID: \"4c5ddc418bd1d17b7357c5c1274f08d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.749231 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.749186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4c5ddc418bd1d17b7357c5c1274f08d1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal\" (UID: \"4c5ddc418bd1d17b7357c5c1274f08d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.749231 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.749204 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/40e393938119b34b55859b23c603acc7-config\") pod \"kube-apiserver-proxy-ip-10-0-138-17.ec2.internal\" (UID: \"40e393938119b34b55859b23c603acc7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.749301 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.749238 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c5ddc418bd1d17b7357c5c1274f08d1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal\" (UID: \"4c5ddc418bd1d17b7357c5c1274f08d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.749301 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.749247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4c5ddc418bd1d17b7357c5c1274f08d1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal\" (UID: \"4c5ddc418bd1d17b7357c5c1274f08d1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.749301 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.749285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/40e393938119b34b55859b23c603acc7-config\") pod \"kube-apiserver-proxy-ip-10-0-138-17.ec2.internal\" (UID: \"40e393938119b34b55859b23c603acc7\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.838612 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.838571 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:04.923092 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.923062 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.928748 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:04.928726 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal" Apr 23 17:41:04.939375 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:04.939352 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:05.039955 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.039907 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:05.140563 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.140514 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-17.ec2.internal\" not found" Apr 23 17:41:05.182794 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.182753 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:05.246982 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.246950 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:05.254722 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.254700 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new 
credentials" Apr 23 17:41:05.254881 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.254859 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 17:41:05.254932 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.254854 2571 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://ae3f1098939394c93a07b9a9adc2f11b-d0393709c08041ad.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.138.17:36296->34.224.35.179:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:05.254932 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.254897 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal" Apr 23 17:41:05.255016 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.254859 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 17:41:05.274066 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.274034 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:41:05.324713 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.324643 2571 apiserver.go:52] "Watching apiserver" Apr 23 17:41:05.331706 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.331680 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 17:41:05.332005 
ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.331983 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-7tx9v","openshift-dns/node-resolver-cxr7m","openshift-multus/multus-4xrxl","openshift-multus/multus-additional-cni-plugins-t79h9","openshift-network-diagnostics/network-check-target-8rrlh","openshift-network-operator/iptables-alerter-spvtt","openshift-ovn-kubernetes/ovnkube-node-cqkbt","kube-system/konnectivity-agent-79jpg","kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal","openshift-image-registry/node-ca-sdzf6","openshift-multus/network-metrics-daemon-qzv7h","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8"] Apr 23 17:41:05.336861 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.336843 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.339190 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.339169 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:41:05.339190 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.339180 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-q77pz\"" Apr 23 17:41:05.339324 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.339171 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 17:41:05.339519 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.339503 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cxr7m" Apr 23 17:41:05.339631 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.339565 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.341238 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.341217 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-z64c7\"" Apr 23 17:41:05.341385 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.341256 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 17:41:05.341385 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.341329 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-2gjvj\"" Apr 23 17:41:05.341499 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.341411 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 17:41:05.341561 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.341497 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 17:41:05.341561 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.341524 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 17:41:05.341561 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.341525 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 17:41:05.341733 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.341499 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 17:41:05.342426 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.342383 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.343983 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.343967 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rjjlh\"" Apr 23 17:41:05.344165 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.344154 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 17:41:05.344216 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.344157 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 17:41:05.344784 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.344758 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:05.344853 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.344835 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:05.346540 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.346525 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 17:41:05.347327 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.347311 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-spvtt" Apr 23 17:41:05.348849 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.348833 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 17:41:05.349045 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.349033 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:41:05.349247 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.349232 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fkl2c\"" Apr 23 17:41:05.349319 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.349305 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 17:41:05.349634 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.349615 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.351373 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.351355 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 17:41:05.351457 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.351445 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 17:41:05.351669 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.351651 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 17:41:05.352068 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.351739 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 17:41:05.352068 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.351846 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:05.352068 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.351932 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dxlhg\"" Apr 23 17:41:05.352068 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.351952 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 17:41:05.352068 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.351937 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 17:41:05.352526 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-kubernetes\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.352632 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352555 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-var-lib-kubelet\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.352632 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67249\" (UniqueName: \"kubernetes.io/projected/ab714fa9-cc35-46b7-9076-b5623bd67831-kube-api-access-67249\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 
17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-log-socket\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352685 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-tuned\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/adddecdf-d1af-4726-baac-6b7ff1828f40-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352754 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352873 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-os-release\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352918 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33946126-5bdc-4047-b09e-8ca68acdbd65-hosts-file\") pod \"node-resolver-cxr7m\" (UID: \"33946126-5bdc-4047-b09e-8ca68acdbd65\") " pod="openshift-dns/node-resolver-cxr7m" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352951 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-node-log\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.352986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-cni-netd\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353020 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6r7s\" (UniqueName: \"kubernetes.io/projected/de5b0280-3d7b-48bb-b05d-befd13392325-kube-api-access-h6r7s\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353099 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-systemd\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353167 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/adddecdf-d1af-4726-baac-6b7ff1828f40-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-host\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353284 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-system-cni-dir\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.353423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353360 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-run-netns\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.354110 
ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353436 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-var-lib-cni-multus\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353512 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-slash\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353562 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-run-netns\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353624 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-run-systemd\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33946126-5bdc-4047-b09e-8ca68acdbd65-tmp-dir\") pod \"node-resolver-cxr7m\" (UID: \"33946126-5bdc-4047-b09e-8ca68acdbd65\") " pod="openshift-dns/node-resolver-cxr7m" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353791 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-run-ovn-kubernetes\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353836 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-modprobe-d\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-sysconfig\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-sysctl-conf\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353920 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ck86f\"" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.353961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-sys\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354002 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-systemd-units\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354056 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354068 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-var-lib-cni-bin\") pod \"multus-4xrxl\" 
(UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.354110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354100 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f79f8f5-4fe2-436d-82d4-a94a0d58ba45-iptables-alerter-script\") pod \"iptables-alerter-spvtt\" (UID: \"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45\") " pod="openshift-network-operator/iptables-alerter-spvtt" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsvv5\" (UniqueName: \"kubernetes.io/projected/0f79f8f5-4fe2-436d-82d4-a94a0d58ba45-kube-api-access-zsvv5\") pod \"iptables-alerter-spvtt\" (UID: \"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45\") " pod="openshift-network-operator/iptables-alerter-spvtt" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354166 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/de5b0280-3d7b-48bb-b05d-befd13392325-ovnkube-config\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354188 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/de5b0280-3d7b-48bb-b05d-befd13392325-ovnkube-script-lib\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-r2fz5\" (UniqueName: \"kubernetes.io/projected/4e3268bc-024b-451b-9f80-42d8dd401c8e-kube-api-access-r2fz5\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-cnibin\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354309 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/adddecdf-d1af-4726-baac-6b7ff1828f40-cni-binary-copy\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr8m6\" (UniqueName: \"kubernetes.io/projected/adddecdf-d1af-4726-baac-6b7ff1828f40-kube-api-access-gr8m6\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ab714fa9-cc35-46b7-9076-b5623bd67831-cni-binary-copy\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " 
pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354413 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-socket-dir-parent\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354442 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-var-lib-kubelet\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354465 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-kubelet\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354494 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-sysctl-d\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354522 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-run-multus-certs\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354550 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-run-ovn\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rlvd\" (UniqueName: \"kubernetes.io/projected/33946126-5bdc-4047-b09e-8ca68acdbd65-kube-api-access-2rlvd\") pod \"node-resolver-cxr7m\" (UID: \"33946126-5bdc-4047-b09e-8ca68acdbd65\") " pod="openshift-dns/node-resolver-cxr7m" Apr 23 17:41:05.354900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-run\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354639 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-run-k8s-cni-cncf-io\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354668 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-etc-kubernetes\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354695 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f79f8f5-4fe2-436d-82d4-a94a0d58ba45-host-slash\") pod \"iptables-alerter-spvtt\" (UID: \"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45\") " pod="openshift-network-operator/iptables-alerter-spvtt" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354723 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/de5b0280-3d7b-48bb-b05d-befd13392325-env-overrides\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354785 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-cnibin\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354815 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-conf-dir\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354898 
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-etc-openvswitch\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-system-cni-dir\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.354978 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-lib-modules\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-hostroot\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-var-lib-openvswitch\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-run-openvswitch\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/de5b0280-3d7b-48bb-b05d-befd13392325-ovn-node-metrics-cert\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-os-release\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355258 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.355603 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355289 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-cni-bin\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.356331 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355316 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-daemon-config\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.356331 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355749 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sdzf6" Apr 23 17:41:05.356331 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355882 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4e3268bc-024b-451b-9f80-42d8dd401c8e-tmp\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.356331 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.355917 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-cni-dir\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.357647 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.357618 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 17:41:05.357745 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.357707 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 17:41:05.357745 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.357731 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 17:41:05.357889 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.357861 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8n4q4\"" Apr 23 17:41:05.358498 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.358478 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:05.358586 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.358565 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:05.358977 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.358959 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 17:41:05.360694 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.360676 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.363050 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.363028 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 17:41:05.363158 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.363071 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 17:41:05.363158 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.363108 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 17:41:05.363370 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.363352 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vqfn4\"" Apr 23 17:41:05.379401 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.379372 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:36:04 +0000 UTC" deadline="2028-01-02 13:40:27.569196446 +0000 UTC" Apr 23 17:41:05.379401 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.379399 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14851h59m22.189799609s" Apr 23 17:41:05.389566 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.389545 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dzbd8" Apr 23 17:41:05.401541 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.401506 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dzbd8" Apr 23 17:41:05.448365 
ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.448347 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 17:41:05.456648 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456624 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rlvd\" (UniqueName: \"kubernetes.io/projected/33946126-5bdc-4047-b09e-8ca68acdbd65-kube-api-access-2rlvd\") pod \"node-resolver-cxr7m\" (UID: \"33946126-5bdc-4047-b09e-8ca68acdbd65\") " pod="openshift-dns/node-resolver-cxr7m" Apr 23 17:41:05.456761 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-run\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.456761 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-run-k8s-cni-cncf-io\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.456761 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-etc-kubernetes\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.456761 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456708 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-sys-fs\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.456761 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456729 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f79f8f5-4fe2-436d-82d4-a94a0d58ba45-host-slash\") pod \"iptables-alerter-spvtt\" (UID: \"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45\") " pod="openshift-network-operator/iptables-alerter-spvtt" Apr 23 17:41:05.456761 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456752 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/de5b0280-3d7b-48bb-b05d-befd13392325-env-overrides\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456755 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-run\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456755 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-run-k8s-cni-cncf-io\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456789 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-etc-kubernetes\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-cnibin\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-cnibin\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456866 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-conf-dir\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456922 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-etc-openvswitch\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456967 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-conf-dir\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456969 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-etc-openvswitch\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457010 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-system-cni-dir\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.457056 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-lib-modules\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.457573 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457087 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-hostroot\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.457573 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457118 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-var-lib-openvswitch\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.457573 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457116 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-system-cni-dir\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.457573 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-run-openvswitch\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.457573 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/de5b0280-3d7b-48bb-b05d-befd13392325-ovn-node-metrics-cert\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.457573 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-hostroot\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.457573 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457210 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-os-release\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.457573 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.457926 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-cni-bin\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.457926 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457725 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-lib-modules\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.457926 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457806 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-daemon-config\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.457926 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/929358d6-5b4a-49e4-a824-b3fbdf245db8-agent-certs\") pod \"konnectivity-agent-79jpg\" (UID: \"929358d6-5b4a-49e4-a824-b3fbdf245db8\") " pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:05.457926 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457876 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz5n2\" (UniqueName: \"kubernetes.io/projected/71c7aa65-427a-49ff-a818-134bb8e8549d-kube-api-access-dz5n2\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.457926 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4e3268bc-024b-451b-9f80-42d8dd401c8e-tmp\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-cni-bin\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457951 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-cni-dir\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.457994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458031 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gl7d\" (UniqueName: \"kubernetes.io/projected/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-kube-api-access-7gl7d\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458063 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-registration-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-device-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-cni-dir\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458127 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-kubernetes\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.456905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f79f8f5-4fe2-436d-82d4-a94a0d58ba45-host-slash\") pod \"iptables-alerter-spvtt\" (UID: \"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45\") " pod="openshift-network-operator/iptables-alerter-spvtt" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458154 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-var-lib-kubelet\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.458196 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458185 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67249\" (UniqueName: \"kubernetes.io/projected/ab714fa9-cc35-46b7-9076-b5623bd67831-kube-api-access-67249\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") 
" pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.458675 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458237 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/de5b0280-3d7b-48bb-b05d-befd13392325-env-overrides\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.458675 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-log-socket\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.458675 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-var-lib-openvswitch\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.458675 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458391 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-kubernetes\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.458675 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458444 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-var-lib-kubelet\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.458675 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458451 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-run-openvswitch\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.458974 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458741 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-daemon-config\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.458974 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458887 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 17:41:05.458974 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-os-release\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.458217 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-log-socket\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 
17:41:05.459234 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-tuned\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/adddecdf-d1af-4726-baac-6b7ff1828f40-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " pod="openshift-network-diagnostics/network-check-target-8rrlh"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459361 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f-host\") pod \"node-ca-sdzf6\" (UID: \"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f\") " pod="openshift-image-registry/node-ca-sdzf6"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459393 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-socket-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-os-release\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459450 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33946126-5bdc-4047-b09e-8ca68acdbd65-hosts-file\") pod \"node-resolver-cxr7m\" (UID: \"33946126-5bdc-4047-b09e-8ca68acdbd65\") " pod="openshift-dns/node-resolver-cxr7m"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-node-log\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-cni-netd\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6r7s\" (UniqueName: \"kubernetes.io/projected/de5b0280-3d7b-48bb-b05d-befd13392325-kube-api-access-h6r7s\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-systemd\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/adddecdf-d1af-4726-baac-6b7ff1828f40-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-host\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-system-cni-dir\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-run-netns\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.460806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459707 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-var-lib-cni-multus\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459724 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-node-log\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459736 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-slash\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-slash\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459845 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-run-netns\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-run-systemd\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.459964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/adddecdf-d1af-4726-baac-6b7ff1828f40-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460015 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460118 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33946126-5bdc-4047-b09e-8ca68acdbd65-tmp-dir\") pod \"node-resolver-cxr7m\" (UID: \"33946126-5bdc-4047-b09e-8ca68acdbd65\") " pod="openshift-dns/node-resolver-cxr7m"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-run-ovn-kubernetes\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-modprobe-d\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-sysconfig\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-sysctl-conf\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460285 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-sys\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460301 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-cni-netd\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-systemd-units\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460362 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-var-lib-cni-bin\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.461615 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/929358d6-5b4a-49e4-a824-b3fbdf245db8-konnectivity-ca\") pod \"konnectivity-agent-79jpg\" (UID: \"929358d6-5b4a-49e4-a824-b3fbdf245db8\") " pod="kube-system/konnectivity-agent-79jpg"
Apr 23 17:41:05.462368 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f79f8f5-4fe2-436d-82d4-a94a0d58ba45-iptables-alerter-script\") pod \"iptables-alerter-spvtt\" (UID: \"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45\") " pod="openshift-network-operator/iptables-alerter-spvtt"
Apr 23 17:41:05.462368 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsvv5\" (UniqueName: \"kubernetes.io/projected/0f79f8f5-4fe2-436d-82d4-a94a0d58ba45-kube-api-access-zsvv5\") pod \"iptables-alerter-spvtt\" (UID: \"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45\") " pod="openshift-network-operator/iptables-alerter-spvtt"
Apr 23 17:41:05.462368 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460494 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/de5b0280-3d7b-48bb-b05d-befd13392325-ovnkube-config\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.462368 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/de5b0280-3d7b-48bb-b05d-befd13392325-ovnkube-script-lib\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.462368 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460553 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fz5\" (UniqueName: \"kubernetes.io/projected/4e3268bc-024b-451b-9f80-42d8dd401c8e-kube-api-access-r2fz5\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.462368 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-cnibin\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.462368 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460599 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-systemd\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.462368 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.460615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/adddecdf-d1af-4726-baac-6b7ff1828f40-cni-binary-copy\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.462725 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462680 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-tuned\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.462801 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4e3268bc-024b-451b-9f80-42d8dd401c8e-tmp\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.462858 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-run-systemd\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.462909 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462864 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/adddecdf-d1af-4726-baac-6b7ff1828f40-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.462909 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-host\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.463071 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr8m6\" (UniqueName: \"kubernetes.io/projected/adddecdf-d1af-4726-baac-6b7ff1828f40-kube-api-access-gr8m6\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.463071 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462949 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-system-cni-dir\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.463071 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ab714fa9-cc35-46b7-9076-b5623bd67831-cni-binary-copy\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.463071 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-socket-dir-parent\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.463071 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.462999 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-run-netns\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.463071 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463031 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-var-lib-kubelet\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.463071 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463056 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-var-lib-cni-multus\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.463071 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f-serviceca\") pod \"node-ca-sdzf6\" (UID: \"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f\") " pod="openshift-image-registry/node-ca-sdzf6"
Apr 23 17:41:05.463429 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-run-netns\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.463429 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6gl\" (UniqueName: \"kubernetes.io/projected/7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f-kube-api-access-zv6gl\") pod \"node-ca-sdzf6\" (UID: \"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f\") " pod="openshift-image-registry/node-ca-sdzf6"
Apr 23 17:41:05.463429 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-etc-selinux\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8"
Apr 23 17:41:05.463429 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-kubelet\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.463429 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463222 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-sysctl-d\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.463429 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-run-multus-certs\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.463429 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-run-ovn\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.463429 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463395 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33946126-5bdc-4047-b09e-8ca68acdbd65-tmp-dir\") pod \"node-resolver-cxr7m\" (UID: \"33946126-5bdc-4047-b09e-8ca68acdbd65\") " pod="openshift-dns/node-resolver-cxr7m"
Apr 23 17:41:05.463429 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463406 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-run-ovn\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.463864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-run-ovn-kubernetes\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.463864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-run-multus-certs\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.463864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463576 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-modprobe-d\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.463864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463608 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-sysctl-d\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.463864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-sysconfig\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.463864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463668 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-var-lib-cni-bin\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.464127 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-os-release\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.464127 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464000 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33946126-5bdc-4047-b09e-8ca68acdbd65-hosts-file\") pod \"node-resolver-cxr7m\" (UID: \"33946126-5bdc-4047-b09e-8ca68acdbd65\") " pod="openshift-dns/node-resolver-cxr7m"
Apr 23 17:41:05.464127 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464112 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-etc-sysctl-conf\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.464258 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464168 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4e3268bc-024b-451b-9f80-42d8dd401c8e-sys\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.464258 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-systemd-units\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.464258 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464252 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.464453 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f79f8f5-4fe2-436d-82d4-a94a0d58ba45-iptables-alerter-script\") pod \"iptables-alerter-spvtt\" (UID: \"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45\") " pod="openshift-network-operator/iptables-alerter-spvtt"
Apr 23 17:41:05.464996 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464524 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/adddecdf-d1af-4726-baac-6b7ff1828f40-cnibin\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.464996 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464627 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ab714fa9-cc35-46b7-9076-b5623bd67831-cni-binary-copy\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.464996 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464703 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-multus-socket-dir-parent\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.464996 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464741 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab714fa9-cc35-46b7-9076-b5623bd67831-host-var-lib-kubelet\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.464996 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464801 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.464996 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.463477 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/de5b0280-3d7b-48bb-b05d-befd13392325-host-kubelet\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.464996 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/de5b0280-3d7b-48bb-b05d-befd13392325-ovnkube-script-lib\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.464996 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.464859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/adddecdf-d1af-4726-baac-6b7ff1828f40-cni-binary-copy\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.465446 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.465216 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/de5b0280-3d7b-48bb-b05d-befd13392325-ovn-node-metrics-cert\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.465446 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.465247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/de5b0280-3d7b-48bb-b05d-befd13392325-ovnkube-config\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.466075 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.466048 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rlvd\" (UniqueName: \"kubernetes.io/projected/33946126-5bdc-4047-b09e-8ca68acdbd65-kube-api-access-2rlvd\") pod \"node-resolver-cxr7m\" (UID: \"33946126-5bdc-4047-b09e-8ca68acdbd65\") " pod="openshift-dns/node-resolver-cxr7m"
Apr 23 17:41:05.466251 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.466229 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67249\" (UniqueName: \"kubernetes.io/projected/ab714fa9-cc35-46b7-9076-b5623bd67831-kube-api-access-67249\") pod \"multus-4xrxl\" (UID: \"ab714fa9-cc35-46b7-9076-b5623bd67831\") " pod="openshift-multus/multus-4xrxl"
Apr 23 17:41:05.466481 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.466462 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:41:05.466524 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.466488 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:41:05.466524 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.466503 2571 projected.go:194] Error preparing data for projected volume kube-api-access-gshbt for pod openshift-network-diagnostics/network-check-target-8rrlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:05.466589 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.466570 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt podName:9c2a48b6-a8f6-4813-bb00-957aa2486e5e nodeName:}" failed. No retries permitted until 2026-04-23 17:41:05.966547137 +0000 UTC m=+2.045119115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gshbt" (UniqueName: "kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt") pod "network-check-target-8rrlh" (UID: "9c2a48b6-a8f6-4813-bb00-957aa2486e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:41:05.468383 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.468363 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6r7s\" (UniqueName: \"kubernetes.io/projected/de5b0280-3d7b-48bb-b05d-befd13392325-kube-api-access-h6r7s\") pod \"ovnkube-node-cqkbt\" (UID: \"de5b0280-3d7b-48bb-b05d-befd13392325\") " pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt"
Apr 23 17:41:05.473154 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.473131 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr8m6\" (UniqueName: \"kubernetes.io/projected/adddecdf-d1af-4726-baac-6b7ff1828f40-kube-api-access-gr8m6\") pod \"multus-additional-cni-plugins-t79h9\" (UID: \"adddecdf-d1af-4726-baac-6b7ff1828f40\") " pod="openshift-multus/multus-additional-cni-plugins-t79h9"
Apr 23 17:41:05.473449 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.473430 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fz5\" (UniqueName: \"kubernetes.io/projected/4e3268bc-024b-451b-9f80-42d8dd401c8e-kube-api-access-r2fz5\") pod \"tuned-7tx9v\" (UID: \"4e3268bc-024b-451b-9f80-42d8dd401c8e\") " pod="openshift-cluster-node-tuning-operator/tuned-7tx9v"
Apr 23 17:41:05.480607 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.480585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsvv5\" (UniqueName: \"kubernetes.io/projected/0f79f8f5-4fe2-436d-82d4-a94a0d58ba45-kube-api-access-zsvv5\") pod \"iptables-alerter-spvtt\" (UID: \"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45\") " pod="openshift-network-operator/iptables-alerter-spvtt"
Apr 23 17:41:05.482557 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.482518 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e393938119b34b55859b23c603acc7.slice/crio-98bf2109a0cb0b61a1288cb76363749ab01e54790710b067da65d8d2ca4cd160 WatchSource:0}: Error finding container 98bf2109a0cb0b61a1288cb76363749ab01e54790710b067da65d8d2ca4cd160: Status 404 returned error can't find the container with id 98bf2109a0cb0b61a1288cb76363749ab01e54790710b067da65d8d2ca4cd160
Apr 23 17:41:05.482837 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.482816 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c5ddc418bd1d17b7357c5c1274f08d1.slice/crio-b9e1bed4557f1ac3d42b46760aec98d47d24c58a1198f2cfd42a10b55dbbcf00 WatchSource:0}: Error finding container b9e1bed4557f1ac3d42b46760aec98d47d24c58a1198f2cfd42a10b55dbbcf00: Status 404 returned error can't find the container with id b9e1bed4557f1ac3d42b46760aec98d47d24c58a1198f2cfd42a10b55dbbcf00
Apr 23 17:41:05.487626 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.487610 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:41:05.490284 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.490242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" event={"ID":"4c5ddc418bd1d17b7357c5c1274f08d1","Type":"ContainerStarted","Data":"b9e1bed4557f1ac3d42b46760aec98d47d24c58a1198f2cfd42a10b55dbbcf00"}
Apr 23 17:41:05.491226 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.491206 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal" event={"ID":"40e393938119b34b55859b23c603acc7","Type":"ContainerStarted","Data":"98bf2109a0cb0b61a1288cb76363749ab01e54790710b067da65d8d2ca4cd160"}
Apr 23 17:41:05.564109 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-sys-fs\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8"
Apr 23 17:41:05.564109 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8"
Apr 23 17:41:05.564359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/929358d6-5b4a-49e4-a824-b3fbdf245db8-agent-certs\")
pod \"konnectivity-agent-79jpg\" (UID: \"929358d6-5b4a-49e4-a824-b3fbdf245db8\") " pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:05.564359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dz5n2\" (UniqueName: \"kubernetes.io/projected/71c7aa65-427a-49ff-a818-134bb8e8549d-kube-api-access-dz5n2\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564249 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:05.564359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564274 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gl7d\" (UniqueName: \"kubernetes.io/projected/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-kube-api-access-7gl7d\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:05.564359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-registration-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-device-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f-host\") pod \"node-ca-sdzf6\" (UID: \"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f\") " pod="openshift-image-registry/node-ca-sdzf6" Apr 23 17:41:05.564359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564210 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-sys-fs\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-socket-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564413 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f-host\") pod \"node-ca-sdzf6\" (UID: \"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f\") " pod="openshift-image-registry/node-ca-sdzf6" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564420 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-device-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.564354 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-registration-dir\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/929358d6-5b4a-49e4-a824-b3fbdf245db8-konnectivity-ca\") pod \"konnectivity-agent-79jpg\" (UID: \"929358d6-5b4a-49e4-a824-b3fbdf245db8\") " pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f-serviceca\") pod \"node-ca-sdzf6\" (UID: \"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f\") " pod="openshift-image-registry/node-ca-sdzf6" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.564509 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs podName:9d9e614d-f61d-4fcd-aaaf-ab97f54f2487 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:06.064489798 +0000 UTC m=+2.143061754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs") pod "network-metrics-daemon-qzv7h" (UID: "9d9e614d-f61d-4fcd-aaaf-ab97f54f2487") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6gl\" (UniqueName: \"kubernetes.io/projected/7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f-kube-api-access-zv6gl\") pod \"node-ca-sdzf6\" (UID: \"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f\") " pod="openshift-image-registry/node-ca-sdzf6" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564560 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-etc-selinux\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564598 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-socket-dir\") pod 
\"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.564738 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564683 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/71c7aa65-427a-49ff-a818-134bb8e8549d-etc-selinux\") pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.565176 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f-serviceca\") pod \"node-ca-sdzf6\" (UID: \"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f\") " pod="openshift-image-registry/node-ca-sdzf6" Apr 23 17:41:05.565176 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.564931 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/929358d6-5b4a-49e4-a824-b3fbdf245db8-konnectivity-ca\") pod \"konnectivity-agent-79jpg\" (UID: \"929358d6-5b4a-49e4-a824-b3fbdf245db8\") " pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:05.566730 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.566712 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/929358d6-5b4a-49e4-a824-b3fbdf245db8-agent-certs\") pod \"konnectivity-agent-79jpg\" (UID: \"929358d6-5b4a-49e4-a824-b3fbdf245db8\") " pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:05.579133 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.579062 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz5n2\" (UniqueName: \"kubernetes.io/projected/71c7aa65-427a-49ff-a818-134bb8e8549d-kube-api-access-dz5n2\") 
pod \"aws-ebs-csi-driver-node-4khz8\" (UID: \"71c7aa65-427a-49ff-a818-134bb8e8549d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.579785 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.579750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gl7d\" (UniqueName: \"kubernetes.io/projected/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-kube-api-access-7gl7d\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:05.580974 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.580958 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6gl\" (UniqueName: \"kubernetes.io/projected/7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f-kube-api-access-zv6gl\") pod \"node-ca-sdzf6\" (UID: \"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f\") " pod="openshift-image-registry/node-ca-sdzf6" Apr 23 17:41:05.670223 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.670187 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" Apr 23 17:41:05.677254 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.677223 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e3268bc_024b_451b_9f80_42d8dd401c8e.slice/crio-a2b1d9bfde689f9b2f6fce0dcf63cc1c6f25ded611e07fbcaf7450d2a476cd8f WatchSource:0}: Error finding container a2b1d9bfde689f9b2f6fce0dcf63cc1c6f25ded611e07fbcaf7450d2a476cd8f: Status 404 returned error can't find the container with id a2b1d9bfde689f9b2f6fce0dcf63cc1c6f25ded611e07fbcaf7450d2a476cd8f Apr 23 17:41:05.688976 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.688948 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4xrxl" Apr 23 17:41:05.694684 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.694655 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cxr7m" Apr 23 17:41:05.695373 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.695350 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab714fa9_cc35_46b7_9076_b5623bd67831.slice/crio-a383b8b05d823bca3aa0b46c07710d4076ac759f797563fb82d638244d674ee4 WatchSource:0}: Error finding container a383b8b05d823bca3aa0b46c07710d4076ac759f797563fb82d638244d674ee4: Status 404 returned error can't find the container with id a383b8b05d823bca3aa0b46c07710d4076ac759f797563fb82d638244d674ee4 Apr 23 17:41:05.702855 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.702816 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33946126_5bdc_4047_b09e_8ca68acdbd65.slice/crio-f48a11b086ce887c027eddeddea294a2bc71dbce516e3ac6a26ae7f70ea174d7 WatchSource:0}: Error finding container f48a11b086ce887c027eddeddea294a2bc71dbce516e3ac6a26ae7f70ea174d7: Status 404 returned error can't find the container with id f48a11b086ce887c027eddeddea294a2bc71dbce516e3ac6a26ae7f70ea174d7 Apr 23 17:41:05.705874 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.705857 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t79h9" Apr 23 17:41:05.711881 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.711859 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadddecdf_d1af_4726_baac_6b7ff1828f40.slice/crio-65ef4ca5b9d127331e682039dabe05f9bb3ac29bf8401cbf8b3fd44297e6e170 WatchSource:0}: Error finding container 65ef4ca5b9d127331e682039dabe05f9bb3ac29bf8401cbf8b3fd44297e6e170: Status 404 returned error can't find the container with id 65ef4ca5b9d127331e682039dabe05f9bb3ac29bf8401cbf8b3fd44297e6e170 Apr 23 17:41:05.724947 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.724921 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-spvtt" Apr 23 17:41:05.730843 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.730806 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f79f8f5_4fe2_436d_82d4_a94a0d58ba45.slice/crio-4c892b756a6249b421ab73e844eb2c3642c467747d59dc25ba4ee870f84bf5d9 WatchSource:0}: Error finding container 4c892b756a6249b421ab73e844eb2c3642c467747d59dc25ba4ee870f84bf5d9: Status 404 returned error can't find the container with id 4c892b756a6249b421ab73e844eb2c3642c467747d59dc25ba4ee870f84bf5d9 Apr 23 17:41:05.742324 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.742305 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:05.748848 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.748822 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5b0280_3d7b_48bb_b05d_befd13392325.slice/crio-70b57ab55a1a5ae6c4ef5b43ba0ceee97321774399c967777d617e5fe1e1cd80 WatchSource:0}: Error finding container 70b57ab55a1a5ae6c4ef5b43ba0ceee97321774399c967777d617e5fe1e1cd80: Status 404 returned error can't find the container with id 70b57ab55a1a5ae6c4ef5b43ba0ceee97321774399c967777d617e5fe1e1cd80 Apr 23 17:41:05.754163 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.754144 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:05.760794 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.760755 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:05.767729 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.767705 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod929358d6_5b4a_49e4_a824_b3fbdf245db8.slice/crio-56e70d9262c5e2cf5c01a62d4ee857cbeda9d981c7843c2c39afe38682bbeb60 WatchSource:0}: Error finding container 56e70d9262c5e2cf5c01a62d4ee857cbeda9d981c7843c2c39afe38682bbeb60: Status 404 returned error can't find the container with id 56e70d9262c5e2cf5c01a62d4ee857cbeda9d981c7843c2c39afe38682bbeb60 Apr 23 17:41:05.783004 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.782984 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sdzf6" Apr 23 17:41:05.788653 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.788633 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" Apr 23 17:41:05.790527 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.790503 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7acbf804_17f1_4ca5_8fe0_94cc15d6bf7f.slice/crio-09a24670bfcb12c735c7d970dd8d52b22a0b7d7b5a6b39731255b40bfe26c4b8 WatchSource:0}: Error finding container 09a24670bfcb12c735c7d970dd8d52b22a0b7d7b5a6b39731255b40bfe26c4b8: Status 404 returned error can't find the container with id 09a24670bfcb12c735c7d970dd8d52b22a0b7d7b5a6b39731255b40bfe26c4b8 Apr 23 17:41:05.795202 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:05.795172 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c7aa65_427a_49ff_a818_134bb8e8549d.slice/crio-89c2268579bc47882fced7eb82465cb7af4c0914318253053fe36a3b91ec44a8 WatchSource:0}: Error finding container 89c2268579bc47882fced7eb82465cb7af4c0914318253053fe36a3b91ec44a8: Status 404 returned error can't find the container with id 89c2268579bc47882fced7eb82465cb7af4c0914318253053fe36a3b91ec44a8 Apr 23 17:41:05.870381 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.870300 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:05.967111 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:05.967075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:05.967257 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.967225 2571 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:05.967257 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.967245 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:05.967257 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.967254 2571 projected.go:194] Error preparing data for projected volume kube-api-access-gshbt for pod openshift-network-diagnostics/network-check-target-8rrlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:05.967433 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:05.967304 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt podName:9c2a48b6-a8f6-4813-bb00-957aa2486e5e nodeName:}" failed. No retries permitted until 2026-04-23 17:41:06.967289717 +0000 UTC m=+3.045861676 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gshbt" (UniqueName: "kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt") pod "network-check-target-8rrlh" (UID: "9c2a48b6-a8f6-4813-bb00-957aa2486e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:06.068247 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.068206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:06.068448 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:06.068383 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:06.068492 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:06.068455 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs podName:9d9e614d-f61d-4fcd-aaaf-ab97f54f2487 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:07.068436202 +0000 UTC m=+3.147008172 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs") pod "network-metrics-daemon-qzv7h" (UID: "9d9e614d-f61d-4fcd-aaaf-ab97f54f2487") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:06.160783 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.160518 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:06.402964 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.402912 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:36:05 +0000 UTC" deadline="2027-10-28 02:14:36.293350283 +0000 UTC" Apr 23 17:41:06.402964 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.402961 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13256h33m29.890393664s" Apr 23 17:41:06.432021 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.431939 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-npjnt"] Apr 23 17:41:06.432584 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.432558 2571 predicate.go:212] "Predicate failed on Pod" pod="kube-system/global-pull-secret-syncer-npjnt" err="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 23 17:41:06.432694 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.432584 2571 kubelet.go:2420] "Pod admission denied" podUID="28d56301-715a-48e7-a655-1d05db71bcc1" pod="kube-system/global-pull-secret-syncer-npjnt" reason="NodeAffinity" message="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 23 17:41:06.471934 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.471895 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-dbus\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:06.472118 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.471985 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:06.472118 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.472026 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-kubelet-config\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:06.500511 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.500469 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" event={"ID":"71c7aa65-427a-49ff-a818-134bb8e8549d","Type":"ContainerStarted","Data":"89c2268579bc47882fced7eb82465cb7af4c0914318253053fe36a3b91ec44a8"} Apr 23 17:41:06.515975 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.515937 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-79jpg" event={"ID":"929358d6-5b4a-49e4-a824-b3fbdf245db8","Type":"ContainerStarted","Data":"56e70d9262c5e2cf5c01a62d4ee857cbeda9d981c7843c2c39afe38682bbeb60"} Apr 23 17:41:06.534025 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.533991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cxr7m" 
event={"ID":"33946126-5bdc-4047-b09e-8ca68acdbd65","Type":"ContainerStarted","Data":"f48a11b086ce887c027eddeddea294a2bc71dbce516e3ac6a26ae7f70ea174d7"} Apr 23 17:41:06.545920 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.545882 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4xrxl" event={"ID":"ab714fa9-cc35-46b7-9076-b5623bd67831","Type":"ContainerStarted","Data":"a383b8b05d823bca3aa0b46c07710d4076ac759f797563fb82d638244d674ee4"} Apr 23 17:41:06.550287 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.550223 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" event={"ID":"4e3268bc-024b-451b-9f80-42d8dd401c8e","Type":"ContainerStarted","Data":"a2b1d9bfde689f9b2f6fce0dcf63cc1c6f25ded611e07fbcaf7450d2a476cd8f"} Apr 23 17:41:06.572260 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.572199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sdzf6" event={"ID":"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f","Type":"ContainerStarted","Data":"09a24670bfcb12c735c7d970dd8d52b22a0b7d7b5a6b39731255b40bfe26c4b8"} Apr 23 17:41:06.572483 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.572398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-dbus\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:06.572570 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.572551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:06.572646 
ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.572593 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-kubelet-config\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:06.572698 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.572687 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-kubelet-config\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:06.572845 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.572830 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-dbus\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:06.572947 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:06.572933 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:06.573001 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:06.572990 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret podName:28d56301-715a-48e7-a655-1d05db71bcc1 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:07.072972913 +0000 UTC m=+3.151544878 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret") pod "global-pull-secret-syncer-npjnt" (UID: "28d56301-715a-48e7-a655-1d05db71bcc1") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:06.596869 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.596789 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerStarted","Data":"70b57ab55a1a5ae6c4ef5b43ba0ceee97321774399c967777d617e5fe1e1cd80"} Apr 23 17:41:06.603321 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.603251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-spvtt" event={"ID":"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45","Type":"ContainerStarted","Data":"4c892b756a6249b421ab73e844eb2c3642c467747d59dc25ba4ee870f84bf5d9"} Apr 23 17:41:06.622840 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.622808 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t79h9" event={"ID":"adddecdf-d1af-4726-baac-6b7ff1828f40","Type":"ContainerStarted","Data":"65ef4ca5b9d127331e682039dabe05f9bb3ac29bf8401cbf8b3fd44297e6e170"} Apr 23 17:41:06.672993 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.672960 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-kubelet-config\") pod \"28d56301-715a-48e7-a655-1d05db71bcc1\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " Apr 23 17:41:06.673152 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.673021 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-dbus\") pod 
\"28d56301-715a-48e7-a655-1d05db71bcc1\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " Apr 23 17:41:06.673241 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.673219 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-dbus" (OuterVolumeSpecName: "dbus") pod "28d56301-715a-48e7-a655-1d05db71bcc1" (UID: "28d56301-715a-48e7-a655-1d05db71bcc1"). InnerVolumeSpecName "dbus". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 23 17:41:06.673316 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.673269 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-kubelet-config" (OuterVolumeSpecName: "kubelet-config") pod "28d56301-715a-48e7-a655-1d05db71bcc1" (UID: "28d56301-715a-48e7-a655-1d05db71bcc1"). InnerVolumeSpecName "kubelet-config". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 23 17:41:06.774462 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.774403 2571 reconciler_common.go:299] "Volume detached for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-kubelet-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:41:06.774462 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.774435 2571 reconciler_common.go:299] "Volume detached for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28d56301-715a-48e7-a655-1d05db71bcc1-dbus\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:41:06.976705 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:06.976666 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " 
pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:06.976894 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:06.976850 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:06.976894 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:06.976870 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:06.976894 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:06.976882 2571 projected.go:194] Error preparing data for projected volume kube-api-access-gshbt for pod openshift-network-diagnostics/network-check-target-8rrlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:06.977061 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:06.976942 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt podName:9c2a48b6-a8f6-4813-bb00-957aa2486e5e nodeName:}" failed. No retries permitted until 2026-04-23 17:41:08.976922933 +0000 UTC m=+5.055494892 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gshbt" (UniqueName: "kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt") pod "network-check-target-8rrlh" (UID: "9c2a48b6-a8f6-4813-bb00-957aa2486e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:07.049613 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:07.049532 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:07.077816 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:07.077761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:07.077986 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:07.077870 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:07.078060 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:07.078000 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:07.078114 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:07.078068 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret podName:28d56301-715a-48e7-a655-1d05db71bcc1 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:08.078049332 +0000 UTC m=+4.156621307 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret") pod "global-pull-secret-syncer-npjnt" (UID: "28d56301-715a-48e7-a655-1d05db71bcc1") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:07.078165 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:07.078155 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:07.078208 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:07.078196 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs podName:9d9e614d-f61d-4fcd-aaaf-ab97f54f2487 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:09.078181718 +0000 UTC m=+5.156753697 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs") pod "network-metrics-daemon-qzv7h" (UID: "9d9e614d-f61d-4fcd-aaaf-ab97f54f2487") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:07.403171 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:07.403071 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:36:05 +0000 UTC" deadline="2027-12-04 21:33:48.861583693 +0000 UTC" Apr 23 17:41:07.403171 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:07.403114 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14163h52m41.458473391s" Apr 23 17:41:07.489323 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:07.488560 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:07.489323 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:07.488690 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:07.489323 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:07.489138 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:07.489323 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:07.489242 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:08.087005 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:08.086969 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:08.087182 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:08.087120 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:08.087263 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:08.087184 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret podName:28d56301-715a-48e7-a655-1d05db71bcc1 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:10.087164306 +0000 UTC m=+6.165736262 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret") pod "global-pull-secret-syncer-npjnt" (UID: "28d56301-715a-48e7-a655-1d05db71bcc1") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:08.995437 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:08.994686 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:08.995437 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:08.994898 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:08.995437 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:08.994919 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:08.995437 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:08.994932 2571 projected.go:194] Error preparing data for projected volume kube-api-access-gshbt for pod openshift-network-diagnostics/network-check-target-8rrlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:08.995437 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:08.994994 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt podName:9c2a48b6-a8f6-4813-bb00-957aa2486e5e nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:12.994974243 +0000 UTC m=+9.073546211 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gshbt" (UniqueName: "kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt") pod "network-check-target-8rrlh" (UID: "9c2a48b6-a8f6-4813-bb00-957aa2486e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:09.095808 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:09.095757 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:09.095998 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:09.095947 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:09.096061 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:09.096015 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs podName:9d9e614d-f61d-4fcd-aaaf-ab97f54f2487 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:13.095998099 +0000 UTC m=+9.174570055 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs") pod "network-metrics-daemon-qzv7h" (UID: "9d9e614d-f61d-4fcd-aaaf-ab97f54f2487") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:09.488524 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:09.488490 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:09.488691 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:09.488624 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:09.489077 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:09.489054 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:09.489184 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:09.489163 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:10.105609 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:10.105566 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:10.106073 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:10.105706 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:10.106073 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:10.105761 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret podName:28d56301-715a-48e7-a655-1d05db71bcc1 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:14.105747987 +0000 UTC m=+10.184319947 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret") pod "global-pull-secret-syncer-npjnt" (UID: "28d56301-715a-48e7-a655-1d05db71bcc1") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:11.488870 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:11.488148 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:11.488870 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:11.488293 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:11.488870 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:11.488726 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:11.488870 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:11.488832 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:13.033446 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:13.033338 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:13.033929 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:13.033538 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:13.033929 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:13.033557 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:13.033929 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:13.033571 2571 projected.go:194] Error preparing data for projected volume kube-api-access-gshbt for pod 
openshift-network-diagnostics/network-check-target-8rrlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:13.033929 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:13.033629 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt podName:9c2a48b6-a8f6-4813-bb00-957aa2486e5e nodeName:}" failed. No retries permitted until 2026-04-23 17:41:21.033611853 +0000 UTC m=+17.112183813 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gshbt" (UniqueName: "kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt") pod "network-check-target-8rrlh" (UID: "9c2a48b6-a8f6-4813-bb00-957aa2486e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:13.134391 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:13.134352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:13.134562 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:13.134516 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:13.134652 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:13.134566 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs podName:9d9e614d-f61d-4fcd-aaaf-ab97f54f2487 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:21.134552992 +0000 UTC m=+17.213124947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs") pod "network-metrics-daemon-qzv7h" (UID: "9d9e614d-f61d-4fcd-aaaf-ab97f54f2487") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:13.488318 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:13.488239 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:13.488497 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:13.488385 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:13.489029 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:13.488762 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:13.489029 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:13.488863 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:14.141722 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.141686 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret\") pod \"global-pull-secret-syncer-npjnt\" (UID: \"28d56301-715a-48e7-a655-1d05db71bcc1\") " pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:14.142167 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:14.141863 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:14.142167 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:14.141924 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret podName:28d56301-715a-48e7-a655-1d05db71bcc1 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:22.141907668 +0000 UTC m=+18.220479624 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret") pod "global-pull-secret-syncer-npjnt" (UID: "28d56301-715a-48e7-a655-1d05db71bcc1") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:14.511617 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.511582 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kube-system/global-pull-secret-syncer-npjnt"] Apr 23 17:41:14.511828 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.511691 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-npjnt" Apr 23 17:41:14.515258 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.515221 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kube-system/global-pull-secret-syncer-npjnt"] Apr 23 17:41:14.517258 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.517223 2571 status_manager.go:895] "Failed to get status for pod" podUID="28d56301-715a-48e7-a655-1d05db71bcc1" pod="kube-system/global-pull-secret-syncer-npjnt" err="pods \"global-pull-secret-syncer-npjnt\" is forbidden: User \"system:node:ip-10-0-138-17.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-138-17.ec2.internal' and this object" Apr 23 17:41:14.535197 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.535170 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hqlck"] Apr 23 17:41:14.541433 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.541411 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:14.541570 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:14.541499 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:14.552948 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.552905 2571 status_manager.go:895] "Failed to get status for pod" podUID="28d56301-715a-48e7-a655-1d05db71bcc1" pod="kube-system/global-pull-secret-syncer-npjnt" err="pods \"global-pull-secret-syncer-npjnt\" is forbidden: User \"system:node:ip-10-0-138-17.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-138-17.ec2.internal' and this object" Apr 23 17:41:14.645574 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.645540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-dbus\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:14.645756 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.645601 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-kubelet-config\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:14.645756 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.645675 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:14.646055 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.646033 2571 
reconciler_common.go:299] "Volume detached for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d56301-715a-48e7-a655-1d05db71bcc1-original-pull-secret\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:41:14.747112 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.746458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-kubelet-config\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:14.747112 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.746510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:14.747112 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.746575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-dbus\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:14.747112 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.746665 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-dbus\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:14.747112 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:14.746718 2571 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-kubelet-config\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:14.747112 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:14.746824 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:14.747112 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:14.746880 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret podName:e92c79cf-5cbc-4aab-b574-cecf28cb3b0f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:15.246862221 +0000 UTC m=+11.325434188 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret") pod "global-pull-secret-syncer-hqlck" (UID: "e92c79cf-5cbc-4aab-b574-cecf28cb3b0f") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:15.250404 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:15.250369 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:15.250900 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:15.250539 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:15.250900 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:15.250597 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret podName:e92c79cf-5cbc-4aab-b574-cecf28cb3b0f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:16.250582458 +0000 UTC m=+12.329154419 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret") pod "global-pull-secret-syncer-hqlck" (UID: "e92c79cf-5cbc-4aab-b574-cecf28cb3b0f") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:15.488527 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:15.488498 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:15.488710 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:15.488629 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:15.489102 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:15.489086 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:15.489213 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:15.489197 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:16.258162 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:16.258121 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:16.258588 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:16.258285 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:16.258588 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:16.258354 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret podName:e92c79cf-5cbc-4aab-b574-cecf28cb3b0f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:18.25833469 +0000 UTC m=+14.336906646 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret") pod "global-pull-secret-syncer-hqlck" (UID: "e92c79cf-5cbc-4aab-b574-cecf28cb3b0f") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:16.488759 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:16.488722 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:16.488936 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:16.488871 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:17.488575 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:17.488546 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:17.489089 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:17.488545 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:17.489089 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:17.488661 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:17.489089 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:17.488754 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:18.270739 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:18.270550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:18.270739 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:18.270719 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:18.270996 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:18.270803 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret podName:e92c79cf-5cbc-4aab-b574-cecf28cb3b0f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:22.270787065 +0000 UTC m=+18.349359035 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret") pod "global-pull-secret-syncer-hqlck" (UID: "e92c79cf-5cbc-4aab-b574-cecf28cb3b0f") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:18.488122 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:18.488087 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:18.488267 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:18.488199 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:19.488293 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:19.488248 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:19.488293 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:19.488301 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:19.488809 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:19.488393 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:19.488809 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:19.488522 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:20.487983 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:20.487876 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:20.488152 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:20.487994 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:21.091144 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:21.091103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:21.091649 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:21.091253 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:21.091649 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:21.091267 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:21.091649 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:21.091278 2571 projected.go:194] Error preparing data for projected volume kube-api-access-gshbt for pod openshift-network-diagnostics/network-check-target-8rrlh: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:21.091649 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:21.091327 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt podName:9c2a48b6-a8f6-4813-bb00-957aa2486e5e nodeName:}" failed. No retries permitted until 2026-04-23 17:41:37.091314832 +0000 UTC m=+33.169886788 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gshbt" (UniqueName: "kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt") pod "network-check-target-8rrlh" (UID: "9c2a48b6-a8f6-4813-bb00-957aa2486e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:21.192217 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:21.192175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:21.192366 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:21.192346 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:21.192426 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:21.192419 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs podName:9d9e614d-f61d-4fcd-aaaf-ab97f54f2487 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:37.192402313 +0000 UTC m=+33.270974269 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs") pod "network-metrics-daemon-qzv7h" (UID: "9d9e614d-f61d-4fcd-aaaf-ab97f54f2487") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:21.488199 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:21.488110 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:21.488355 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:21.488110 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:21.488355 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:21.488259 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:21.488355 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:21.488328 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:22.300758 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:22.300720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:22.301166 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:22.300888 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:22.301166 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:22.300955 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret podName:e92c79cf-5cbc-4aab-b574-cecf28cb3b0f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:30.300941525 +0000 UTC m=+26.379513485 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret") pod "global-pull-secret-syncer-hqlck" (UID: "e92c79cf-5cbc-4aab-b574-cecf28cb3b0f") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:22.488421 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:22.488382 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:22.488603 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:22.488524 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:23.488464 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:23.488426 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:23.488464 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:23.488468 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:23.489009 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:23.488571 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:23.489009 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:23.488693 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:24.488760 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.488732 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:24.489684 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:24.488852 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:24.660083 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.660053 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 17:41:24.660904 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.660787 2571 generic.go:358] "Generic (PLEG): container finished" podID="de5b0280-3d7b-48bb-b05d-befd13392325" containerID="5b1acdedf6eb952a02443b99a0fd084f4862616be709a315d9fb4c9ee5a92b3b" exitCode=1 Apr 23 17:41:24.660904 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.660890 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerStarted","Data":"ebf2fe0d253ec54c6b935fd1653b7a3e7150d994d58fd3c9aa43e0620cee11a6"} Apr 23 17:41:24.661049 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.660924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerStarted","Data":"d3d1df7511bf406979b38f07e2b7309f4364cb3f3f461493f247da29e3fa750c"} Apr 23 
17:41:24.661049 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.660934 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerStarted","Data":"5bc6d0316455db473a29b25c875b9f6a0da154265300d9d5d98095d6e1a48f86"} Apr 23 17:41:24.661049 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.660942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerStarted","Data":"f310079f98c66c4cc4be909adafa4f33325be7f5eec8226e69ed47891a18b562"} Apr 23 17:41:24.661049 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.660950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerDied","Data":"5b1acdedf6eb952a02443b99a0fd084f4862616be709a315d9fb4c9ee5a92b3b"} Apr 23 17:41:24.661049 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.660961 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerStarted","Data":"bc7ca3dfa818d39a678f12cdaa1c90acb5d78c35b806a9eefaecd634be9786e4"} Apr 23 17:41:24.664188 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.664159 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4xrxl" event={"ID":"ab714fa9-cc35-46b7-9076-b5623bd67831","Type":"ContainerStarted","Data":"feaaf3285403c07ff56fe6bcb0de8f309af671d68e2a7f83cd60400d3978dd80"} Apr 23 17:41:24.665495 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.665476 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" 
event={"ID":"4e3268bc-024b-451b-9f80-42d8dd401c8e","Type":"ContainerStarted","Data":"541ef51a74e3bd4d94c5ce77d7ffffe541cfb4abed352bf1cc40bdf3a01a42c4"} Apr 23 17:41:24.667661 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.667640 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal" event={"ID":"40e393938119b34b55859b23c603acc7","Type":"ContainerStarted","Data":"caf2d6d796513180fad0c5fadb2da220776fef309c351042fee0bb7bb857401b"} Apr 23 17:41:24.679317 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.679258 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4xrxl" podStartSLOduration=2.324507176 podStartE2EDuration="20.679242905s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:41:05.698062437 +0000 UTC m=+1.776634406" lastFinishedPulling="2026-04-23 17:41:24.052798179 +0000 UTC m=+20.131370135" observedRunningTime="2026-04-23 17:41:24.678543582 +0000 UTC m=+20.757115561" watchObservedRunningTime="2026-04-23 17:41:24.679242905 +0000 UTC m=+20.757814884" Apr 23 17:41:24.691759 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.691711 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7tx9v" podStartSLOduration=2.317238701 podStartE2EDuration="20.6916947s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:41:05.678881473 +0000 UTC m=+1.757453433" lastFinishedPulling="2026-04-23 17:41:24.053337469 +0000 UTC m=+20.131909432" observedRunningTime="2026-04-23 17:41:24.691591913 +0000 UTC m=+20.770163892" watchObservedRunningTime="2026-04-23 17:41:24.6916947 +0000 UTC m=+20.770266679" Apr 23 17:41:24.705027 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:24.704742 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-17.ec2.internal" 
podStartSLOduration=19.704724657 podStartE2EDuration="19.704724657s" podCreationTimestamp="2026-04-23 17:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:41:24.704337262 +0000 UTC m=+20.782909242" watchObservedRunningTime="2026-04-23 17:41:24.704724657 +0000 UTC m=+20.783296635" Apr 23 17:41:25.487897 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.487705 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:25.488047 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.487705 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:25.488047 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:25.487978 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:25.488124 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:25.488093 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:25.629929 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.629907 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 17:41:25.670754 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.670725 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cxr7m" event={"ID":"33946126-5bdc-4047-b09e-8ca68acdbd65","Type":"ContainerStarted","Data":"6a5553b97a6a074c050856dc2a257b2b093ae6db6964dfb90d56825a2f362726"} Apr 23 17:41:25.672079 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.672052 2571 generic.go:358] "Generic (PLEG): container finished" podID="4c5ddc418bd1d17b7357c5c1274f08d1" containerID="bf98c2c0681ffc48da4fb47cf03b4daec81bf57d602b6fbfb5ff496877ed61b4" exitCode=0 Apr 23 17:41:25.672180 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.672129 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" event={"ID":"4c5ddc418bd1d17b7357c5c1274f08d1","Type":"ContainerDied","Data":"bf98c2c0681ffc48da4fb47cf03b4daec81bf57d602b6fbfb5ff496877ed61b4"} Apr 23 17:41:25.672244 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.672225 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" Apr 23 17:41:25.673303 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.673280 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sdzf6" event={"ID":"7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f","Type":"ContainerStarted","Data":"f19344bd1f096161b6ffa41c7705c7aa9daccbb9921602f26b4aa47e290c19f6"} Apr 23 17:41:25.674405 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.674372 2571 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-operator/iptables-alerter-spvtt" event={"ID":"0f79f8f5-4fe2-436d-82d4-a94a0d58ba45","Type":"ContainerStarted","Data":"586f01bbc7c1bc7db0bc71ead4611beb9159284a14e69a0694f74ca75af7c4a3"} Apr 23 17:41:25.675612 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.675590 2571 generic.go:358] "Generic (PLEG): container finished" podID="adddecdf-d1af-4726-baac-6b7ff1828f40" containerID="0d6a444722884830355d78ef05417221d76fddb089d0f2b4f91ccc06103b5ec7" exitCode=0 Apr 23 17:41:25.675686 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.675669 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t79h9" event={"ID":"adddecdf-d1af-4726-baac-6b7ff1828f40","Type":"ContainerDied","Data":"0d6a444722884830355d78ef05417221d76fddb089d0f2b4f91ccc06103b5ec7"} Apr 23 17:41:25.677750 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.677715 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" event={"ID":"71c7aa65-427a-49ff-a818-134bb8e8549d","Type":"ContainerStarted","Data":"7c68377fed18de605bd6c1dff9e3f8ce61a7d49e0576c746a711c4e304d692c6"} Apr 23 17:41:25.677750 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.677742 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" event={"ID":"71c7aa65-427a-49ff-a818-134bb8e8549d","Type":"ContainerStarted","Data":"0a8b82a65892da11d5a10dd33ae75f7932a301bab7378ef267069f697d1968f7"} Apr 23 17:41:25.678919 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.678902 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-79jpg" event={"ID":"929358d6-5b4a-49e4-a824-b3fbdf245db8","Type":"ContainerStarted","Data":"d439bb34d0ff082596ec6436aa50815a4218b0219891c58dd045597bf015eda2"} Apr 23 17:41:25.688481 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.688462 2571 warnings.go:110] "Warning: 
metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:41:25.688932 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.688867 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal"] Apr 23 17:41:25.718741 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.718698 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cxr7m" podStartSLOduration=3.370564547 podStartE2EDuration="21.718685224s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:41:05.704450716 +0000 UTC m=+1.783022672" lastFinishedPulling="2026-04-23 17:41:24.052571379 +0000 UTC m=+20.131143349" observedRunningTime="2026-04-23 17:41:25.696161794 +0000 UTC m=+21.774733773" watchObservedRunningTime="2026-04-23 17:41:25.718685224 +0000 UTC m=+21.797257203" Apr 23 17:41:25.737852 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.737761 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-spvtt" podStartSLOduration=3.419111529 podStartE2EDuration="21.737749448s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:41:05.732304755 +0000 UTC m=+1.810876711" lastFinishedPulling="2026-04-23 17:41:24.050942665 +0000 UTC m=+20.129514630" observedRunningTime="2026-04-23 17:41:25.718399891 +0000 UTC m=+21.796971882" watchObservedRunningTime="2026-04-23 17:41:25.737749448 +0000 UTC m=+21.816321432" Apr 23 17:41:25.755248 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.755197 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-79jpg" podStartSLOduration=3.473591893 podStartE2EDuration="21.755180542s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 
17:41:05.769298405 +0000 UTC m=+1.847870364" lastFinishedPulling="2026-04-23 17:41:24.050887047 +0000 UTC m=+20.129459013" observedRunningTime="2026-04-23 17:41:25.738188982 +0000 UTC m=+21.816760961" watchObservedRunningTime="2026-04-23 17:41:25.755180542 +0000 UTC m=+21.833752522" Apr 23 17:41:25.776248 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:25.776204 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sdzf6" podStartSLOduration=3.51732277 podStartE2EDuration="21.776187446s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:41:05.792050425 +0000 UTC m=+1.870622380" lastFinishedPulling="2026-04-23 17:41:24.050915085 +0000 UTC m=+20.129487056" observedRunningTime="2026-04-23 17:41:25.775650642 +0000 UTC m=+21.854222617" watchObservedRunningTime="2026-04-23 17:41:25.776187446 +0000 UTC m=+21.854759425" Apr 23 17:41:26.140922 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.140393 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:26.141763 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.141738 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:26.421238 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.421141 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:41:25.629924179Z","UUID":"8a9ee4ba-d996-4335-8b93-4d7cabd12012","Handler":null,"Name":"","Endpoint":""} Apr 23 17:41:26.423550 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.423524 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 17:41:26.423665 ip-10-0-138-17 kubenswrapper[2571]: 
I0423 17:41:26.423562 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 17:41:26.488089 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.488056 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:26.488269 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:26.488185 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:26.685992 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.684153 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" event={"ID":"71c7aa65-427a-49ff-a818-134bb8e8549d","Type":"ContainerStarted","Data":"e601b4400e81fea369297738ed96e56eb96646ccf638270aa86663312006aaa6"} Apr 23 17:41:26.688425 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.688401 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" event={"ID":"4c5ddc418bd1d17b7357c5c1274f08d1","Type":"ContainerStarted","Data":"2250178cb4b8694d7bb93320c1333a0866ed8e12ea18e288ee5547c0bb2709bf"} Apr 23 17:41:26.688641 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.688480 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-79jpg" Apr 23 17:41:26.689183 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.689151 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-79jpg" Apr 23 
17:41:26.698763 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:26.698715 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4khz8" podStartSLOduration=2.159205759 podStartE2EDuration="22.698701188s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:41:05.796701945 +0000 UTC m=+1.875273905" lastFinishedPulling="2026-04-23 17:41:26.336197172 +0000 UTC m=+22.414769334" observedRunningTime="2026-04-23 17:41:26.698507223 +0000 UTC m=+22.777079239" watchObservedRunningTime="2026-04-23 17:41:26.698701188 +0000 UTC m=+22.777273166" Apr 23 17:41:27.487722 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:27.487690 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:27.487932 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:27.487694 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:27.487932 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:27.487835 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:27.487932 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:27.487906 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:27.693179 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:27.693148 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 17:41:27.693652 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:27.693614 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerStarted","Data":"40bac37c0b0d2a94a78ff3a5c75de9eef06eb2f7a1be4a540de70c40007f09c4"} Apr 23 17:41:28.488258 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:28.488221 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:28.488458 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:28.488363 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:29.488758 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:29.488576 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:29.489238 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:29.488576 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:29.489238 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:29.488877 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:29.489238 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:29.489035 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:30.362737 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.362568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:30.362926 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:30.362710 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:30.362926 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:30.362833 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret podName:e92c79cf-5cbc-4aab-b574-cecf28cb3b0f nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:46.362815627 +0000 UTC m=+42.441387590 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret") pod "global-pull-secret-syncer-hqlck" (UID: "e92c79cf-5cbc-4aab-b574-cecf28cb3b0f") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:41:30.488038 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.488005 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:30.488181 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:30.488125 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:30.701211 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.701115 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 17:41:30.701982 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.701468 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerStarted","Data":"ac024f886ea82e2b2509ece06f74c3f7ac02ce9f509996826ff2849d3c4a5b17"} Apr 23 17:41:30.701982 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.701814 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:30.702090 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.702002 2571 scope.go:117] "RemoveContainer" containerID="5b1acdedf6eb952a02443b99a0fd084f4862616be709a315d9fb4c9ee5a92b3b" Apr 23 17:41:30.703286 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.703226 2571 generic.go:358] "Generic (PLEG): container finished" podID="adddecdf-d1af-4726-baac-6b7ff1828f40" containerID="4c91221d789df29a8b0aad82aee6c6d704035effc71bafad61faf2c8e9ba707a" exitCode=0 Apr 23 17:41:30.703417 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.703317 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t79h9" event={"ID":"adddecdf-d1af-4726-baac-6b7ff1828f40","Type":"ContainerDied","Data":"4c91221d789df29a8b0aad82aee6c6d704035effc71bafad61faf2c8e9ba707a"} Apr 23 17:41:30.717814 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.717793 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:30.723304 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:30.723265 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-17.ec2.internal" podStartSLOduration=5.7232544149999995 podStartE2EDuration="5.723254415s" podCreationTimestamp="2026-04-23 17:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:41:26.728188533 +0000 UTC m=+22.806760513" watchObservedRunningTime="2026-04-23 17:41:30.723254415 +0000 UTC m=+26.801826393" Apr 23 17:41:31.488594 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.488567 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:31.488743 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.488567 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:31.488743 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:31.488663 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:31.488843 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:31.488746 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:31.647986 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.647947 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qzv7h"] Apr 23 17:41:31.648403 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.648380 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8rrlh"] Apr 23 17:41:31.650447 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.650395 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hqlck"] Apr 23 17:41:31.650546 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.650533 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:31.650658 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:31.650632 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:31.707341 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.707307 2571 generic.go:358] "Generic (PLEG): container finished" podID="adddecdf-d1af-4726-baac-6b7ff1828f40" containerID="01104698fb25e553302fb51b816163cada567fdc446a63a6fdd52a452a33d03b" exitCode=0 Apr 23 17:41:31.707756 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.707405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t79h9" event={"ID":"adddecdf-d1af-4726-baac-6b7ff1828f40","Type":"ContainerDied","Data":"01104698fb25e553302fb51b816163cada567fdc446a63a6fdd52a452a33d03b"} Apr 23 17:41:31.710820 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.710802 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 17:41:31.711191 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.711165 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" event={"ID":"de5b0280-3d7b-48bb-b05d-befd13392325","Type":"ContainerStarted","Data":"ffabd79400bdf5272c7f35af8f3f10a551738982d211dace58058a467a4e751e"} Apr 23 17:41:31.711277 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.711221 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:31.711339 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.711225 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:31.711380 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:31.711344 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:31.711472 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.711456 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:31.711684 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.711496 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:31.711684 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:31.711624 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:31.727129 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.727104 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:41:31.747086 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:31.747024 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" podStartSLOduration=9.349909588 podStartE2EDuration="27.74700196s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:41:05.750389075 +0000 UTC m=+1.828961034" lastFinishedPulling="2026-04-23 17:41:24.147481445 +0000 UTC m=+20.226053406" observedRunningTime="2026-04-23 17:41:31.747003987 +0000 UTC m=+27.825575966" watchObservedRunningTime="2026-04-23 17:41:31.74700196 +0000 UTC m=+27.825573939" Apr 23 17:41:32.715645 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:32.715557 2571 generic.go:358] "Generic (PLEG): container finished" podID="adddecdf-d1af-4726-baac-6b7ff1828f40" containerID="b1ee78dc08619c672a3caa0abe3dd6d6d9865af22ff777bf47bd0ebb1d7c24d0" exitCode=0 Apr 23 17:41:32.716001 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:32.715653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t79h9" event={"ID":"adddecdf-d1af-4726-baac-6b7ff1828f40","Type":"ContainerDied","Data":"b1ee78dc08619c672a3caa0abe3dd6d6d9865af22ff777bf47bd0ebb1d7c24d0"} Apr 23 17:41:33.487939 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:33.487847 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:33.487939 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:33.487883 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:33.487939 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:33.487903 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:33.488215 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:33.488014 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:33.488215 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:33.488118 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:33.488302 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:33.488212 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:35.488737 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:35.488518 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:35.489228 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:35.488594 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:35.489228 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:35.488853 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqlck" podUID="e92c79cf-5cbc-4aab-b574-cecf28cb3b0f" Apr 23 17:41:35.489228 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:35.488915 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8rrlh" podUID="9c2a48b6-a8f6-4813-bb00-957aa2486e5e" Apr 23 17:41:35.489228 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:35.488630 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:35.489228 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:35.488993 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:41:37.112155 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.112107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:41:37.112595 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.112294 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:37.112595 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.112322 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:37.112595 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.112336 2571 projected.go:194] Error preparing data for projected volume kube-api-access-gshbt for pod openshift-network-diagnostics/network-check-target-8rrlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:37.112595 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.112400 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt podName:9c2a48b6-a8f6-4813-bb00-957aa2486e5e nodeName:}" failed. No retries permitted until 2026-04-23 17:42:09.112386235 +0000 UTC m=+65.190958191 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gshbt" (UniqueName: "kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt") pod "network-check-target-8rrlh" (UID: "9c2a48b6-a8f6-4813-bb00-957aa2486e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:37.213296 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.213262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:41:37.213476 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.213437 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:37.213549 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.213524 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs podName:9d9e614d-f61d-4fcd-aaaf-ab97f54f2487 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:09.213502756 +0000 UTC m=+65.292074726 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs") pod "network-metrics-daemon-qzv7h" (UID: "9d9e614d-f61d-4fcd-aaaf-ab97f54f2487") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:37.267789 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.267747 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-17.ec2.internal" event="NodeReady" Apr 23 17:41:37.267966 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.267900 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 17:41:37.301715 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.301681 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"] Apr 23 17:41:37.339257 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.339229 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d77897766-fz4ms"] Apr 23 17:41:37.339428 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.339325 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" Apr 23 17:41:37.342207 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.341973 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 17:41:37.342207 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.342068 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 17:41:37.342207 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.342112 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 17:41:37.342491 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.342473 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-ffqzs\"" Apr 23 17:41:37.342817 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.342796 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 17:41:37.358102 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.358078 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"] Apr 23 17:41:37.358267 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.358245 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:41:37.360557 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.360539 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 17:41:37.360993 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.360962 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 17:41:37.361105 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.361058 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7hx6w\"" Apr 23 17:41:37.361170 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.361109 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 17:41:37.365954 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.365901 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 17:41:37.377143 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.377119 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"] Apr 23 17:41:37.377330 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.377312 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.379719 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.379694 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 23 17:41:37.398241 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.398217 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"]
Apr 23 17:41:37.398241 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.398248 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"]
Apr 23 17:41:37.398418 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.398259 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d77897766-fz4ms"]
Apr 23 17:41:37.398418 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.398275 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4txc2"]
Apr 23 17:41:37.398418 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.398368 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.401590 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.401412 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 23 17:41:37.401590 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.401490 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 23 17:41:37.401590 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.401490 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 23 17:41:37.401818 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.401493 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 23 17:41:37.414473 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.414449 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4txc2"]
Apr 23 17:41:37.414473 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.414478 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"]
Apr 23 17:41:37.414637 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.414616 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4txc2"
Apr 23 17:41:37.416908 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.416891 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 17:41:37.417639 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.417619 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 17:41:37.417749 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.417643 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bmxrf\""
Apr 23 17:41:37.417749 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.417681 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 17:41:37.432978 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.432934 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j4f7r"]
Apr 23 17:41:37.454213 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.454186 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j4f7r"]
Apr 23 17:41:37.454353 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.454339 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j4f7r"
Apr 23 17:41:37.456451 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.456426 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 17:41:37.456589 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.456431 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-b6vkm\""
Apr 23 17:41:37.456881 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.456861 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 17:41:37.488662 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.488634 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck"
Apr 23 17:41:37.488885 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.488635 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh"
Apr 23 17:41:37.488947 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.488893 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h"
Apr 23 17:41:37.491036 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.490910 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 17:41:37.491036 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.490933 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qk5s8\""
Apr 23 17:41:37.491036 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.490944 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 17:41:37.491036 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.490945 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 17:41:37.491036 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.490910 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kjp2q\""
Apr 23 17:41:37.491277 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.490916 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 17:41:37.515058 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-registry-certificates\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.515174 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515071 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0204dafa-5b79-44f7-b2f8-75ffc5549db4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-ccd7b688f-jp4jp\" (UID: \"0204dafa-5b79-44f7-b2f8-75ffc5549db4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"
Apr 23 17:41:37.515174 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515100 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-bound-sa-token\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.515174 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bd621488-e3c6-474f-8312-4fc8287e45e6-klusterlet-config\") pod \"klusterlet-addon-workmgr-77df67f946-r5dl4\" (UID: \"bd621488-e3c6-474f-8312-4fc8287e45e6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.515299 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515181 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2km\" (UniqueName: \"kubernetes.io/projected/9840772f-9282-45fa-a2c3-d4bfeab937c3-kube-api-access-8b2km\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2"
Apr 23 17:41:37.515299 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-installation-pull-secrets\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.515299 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515260 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.515299 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5052b556-e070-4527-add1-23d7af9bfa63-ca-trust-extracted\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.515436 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515316 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.515436 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7s98\" (UniqueName: \"kubernetes.io/projected/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-kube-api-access-s7s98\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.515436 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515368 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6thm\" (UniqueName: \"kubernetes.io/projected/bd621488-e3c6-474f-8312-4fc8287e45e6-kube-api-access-v6thm\") pod \"klusterlet-addon-workmgr-77df67f946-r5dl4\" (UID: \"bd621488-e3c6-474f-8312-4fc8287e45e6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.515436 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515410 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-image-registry-private-configuration\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.515567 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515435 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7kk8\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-kube-api-access-q7kk8\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.515567 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515463 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-trusted-ca\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.515567 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6dns\" (UniqueName: \"kubernetes.io/projected/0204dafa-5b79-44f7-b2f8-75ffc5549db4-kube-api-access-m6dns\") pod \"managed-serviceaccount-addon-agent-ccd7b688f-jp4jp\" (UID: \"0204dafa-5b79-44f7-b2f8-75ffc5549db4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"
Apr 23 17:41:37.515567 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515518 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2"
Apr 23 17:41:37.515567 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515558 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.515805 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-ca\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.515805 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515623 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd621488-e3c6-474f-8312-4fc8287e45e6-tmp\") pod \"klusterlet-addon-workmgr-77df67f946-r5dl4\" (UID: \"bd621488-e3c6-474f-8312-4fc8287e45e6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.515805 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515661 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-hub\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.515805 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.515690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.616593 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616508 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.616593 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-registry-certificates\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616612 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0204dafa-5b79-44f7-b2f8-75ffc5549db4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-ccd7b688f-jp4jp\" (UID: \"0204dafa-5b79-44f7-b2f8-75ffc5549db4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616641 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-bound-sa-token\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.616643 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.616666 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d77897766-fz4ms: secret "image-registry-tls" not found
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616666 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bd621488-e3c6-474f-8312-4fc8287e45e6-klusterlet-config\") pod \"klusterlet-addon-workmgr-77df67f946-r5dl4\" (UID: \"bd621488-e3c6-474f-8312-4fc8287e45e6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2km\" (UniqueName: \"kubernetes.io/projected/9840772f-9282-45fa-a2c3-d4bfeab937c3-kube-api-access-8b2km\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2"
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.616732 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls podName:5052b556-e070-4527-add1-23d7af9bfa63 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:38.116710489 +0000 UTC m=+34.195282445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls") pod "image-registry-6d77897766-fz4ms" (UID: "5052b556-e070-4527-add1-23d7af9bfa63") : secret "image-registry-tls" not found
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616759 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvrl8\" (UniqueName: \"kubernetes.io/projected/245df2e7-7ac7-458b-a0e7-7f1121debc71-kube-api-access-zvrl8\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r"
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616820 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-installation-pull-secrets\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.616864 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5052b556-e070-4527-add1-23d7af9bfa63-ca-trust-extracted\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616904 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7s98\" (UniqueName: \"kubernetes.io/projected/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-kube-api-access-s7s98\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616956 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6thm\" (UniqueName: \"kubernetes.io/projected/bd621488-e3c6-474f-8312-4fc8287e45e6-kube-api-access-v6thm\") pod \"klusterlet-addon-workmgr-77df67f946-r5dl4\" (UID: \"bd621488-e3c6-474f-8312-4fc8287e45e6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.616997 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-image-registry-private-configuration\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617023 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7kk8\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-kube-api-access-q7kk8\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/245df2e7-7ac7-458b-a0e7-7f1121debc71-config-volume\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-trusted-ca\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6dns\" (UniqueName: \"kubernetes.io/projected/0204dafa-5b79-44f7-b2f8-75ffc5549db4-kube-api-access-m6dns\") pod \"managed-serviceaccount-addon-agent-ccd7b688f-jp4jp\" (UID: \"0204dafa-5b79-44f7-b2f8-75ffc5549db4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617170 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617228 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/245df2e7-7ac7-458b-a0e7-7f1121debc71-tmp-dir\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617264 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-registry-certificates\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617292 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-ca\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.617329 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd621488-e3c6-474f-8312-4fc8287e45e6-tmp\") pod \"klusterlet-addon-workmgr-77df67f946-r5dl4\" (UID: \"bd621488-e3c6-474f-8312-4fc8287e45e6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.618077 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.617365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-hub\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.619227 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.618172 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.619227 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.618901 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:41:37.619227 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.618966 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert podName:9840772f-9282-45fa-a2c3-d4bfeab937c3 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:38.118949371 +0000 UTC m=+34.197521329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert") pod "ingress-canary-4txc2" (UID: "9840772f-9282-45fa-a2c3-d4bfeab937c3") : secret "canary-serving-cert" not found
Apr 23 17:41:37.619990 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.619962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5052b556-e070-4527-add1-23d7af9bfa63-ca-trust-extracted\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.620458 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.620395 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-trusted-ca\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.620458 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.620414 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd621488-e3c6-474f-8312-4fc8287e45e6-tmp\") pod \"klusterlet-addon-workmgr-77df67f946-r5dl4\" (UID: \"bd621488-e3c6-474f-8312-4fc8287e45e6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.622202 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.622179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-image-registry-private-configuration\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.622478 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.622454 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0204dafa-5b79-44f7-b2f8-75ffc5549db4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-ccd7b688f-jp4jp\" (UID: \"0204dafa-5b79-44f7-b2f8-75ffc5549db4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"
Apr 23 17:41:37.622598 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.622582 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.622986 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.622962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-installation-pull-secrets\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.623075 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.622982 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.623710 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.623678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-hub\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.624402 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.624379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-ca\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.628010 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.627983 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-bound-sa-token\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.629278 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.629235 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6thm\" (UniqueName: \"kubernetes.io/projected/bd621488-e3c6-474f-8312-4fc8287e45e6-kube-api-access-v6thm\") pod \"klusterlet-addon-workmgr-77df67f946-r5dl4\" (UID: \"bd621488-e3c6-474f-8312-4fc8287e45e6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.629604 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.629552 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2km\" (UniqueName: \"kubernetes.io/projected/9840772f-9282-45fa-a2c3-d4bfeab937c3-kube-api-access-8b2km\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2"
Apr 23 17:41:37.630328 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.630304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7s98\" (UniqueName: \"kubernetes.io/projected/f7819252-f0cd-46a2-a9e8-b9a71ed7592a-kube-api-access-s7s98\") pod \"cluster-proxy-proxy-agent-8f68778c8-gnsjb\" (UID: \"f7819252-f0cd-46a2-a9e8-b9a71ed7592a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"
Apr 23 17:41:37.632563 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.632510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bd621488-e3c6-474f-8312-4fc8287e45e6-klusterlet-config\") pod \"klusterlet-addon-workmgr-77df67f946-r5dl4\" (UID: \"bd621488-e3c6-474f-8312-4fc8287e45e6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.642287 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.642265 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7kk8\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-kube-api-access-q7kk8\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:41:37.642379 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.642282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6dns\" (UniqueName: \"kubernetes.io/projected/0204dafa-5b79-44f7-b2f8-75ffc5549db4-kube-api-access-m6dns\") pod \"managed-serviceaccount-addon-agent-ccd7b688f-jp4jp\" (UID: \"0204dafa-5b79-44f7-b2f8-75ffc5549db4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"
Apr 23 17:41:37.665384 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.665355 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"
Apr 23 17:41:37.687938 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.687904 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:41:37.708735 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.708698 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" Apr 23 17:41:37.718708 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.718681 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvrl8\" (UniqueName: \"kubernetes.io/projected/245df2e7-7ac7-458b-a0e7-7f1121debc71-kube-api-access-zvrl8\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:37.718882 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.718741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/245df2e7-7ac7-458b-a0e7-7f1121debc71-config-volume\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:37.718942 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.718924 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:37.719003 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.718978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/245df2e7-7ac7-458b-a0e7-7f1121debc71-tmp-dir\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:37.719070 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.719047 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:37.719173 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:37.719121 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls podName:245df2e7-7ac7-458b-a0e7-7f1121debc71 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:38.219101249 +0000 UTC m=+34.297673220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls") pod "dns-default-j4f7r" (UID: "245df2e7-7ac7-458b-a0e7-7f1121debc71") : secret "dns-default-metrics-tls" not found Apr 23 17:41:37.719250 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.719236 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/245df2e7-7ac7-458b-a0e7-7f1121debc71-tmp-dir\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:37.731412 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.729229 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/245df2e7-7ac7-458b-a0e7-7f1121debc71-config-volume\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:37.731412 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:37.730495 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvrl8\" (UniqueName: \"kubernetes.io/projected/245df2e7-7ac7-458b-a0e7-7f1121debc71-kube-api-access-zvrl8\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:38.122927 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.122884 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: 
\"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:41:38.123586 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.123006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:41:38.123586 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:38.123065 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:41:38.123586 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:38.123085 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d77897766-fz4ms: secret "image-registry-tls" not found Apr 23 17:41:38.123586 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:38.123122 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:38.123586 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:38.123155 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls podName:5052b556-e070-4527-add1-23d7af9bfa63 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:39.123133901 +0000 UTC m=+35.201705874 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls") pod "image-registry-6d77897766-fz4ms" (UID: "5052b556-e070-4527-add1-23d7af9bfa63") : secret "image-registry-tls" not found Apr 23 17:41:38.123586 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:38.123177 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert podName:9840772f-9282-45fa-a2c3-d4bfeab937c3 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:39.123167349 +0000 UTC m=+35.201739305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert") pod "ingress-canary-4txc2" (UID: "9840772f-9282-45fa-a2c3-d4bfeab937c3") : secret "canary-serving-cert" not found Apr 23 17:41:38.223984 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.223949 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:38.224185 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:38.224138 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:38.224259 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:38.224223 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls podName:245df2e7-7ac7-458b-a0e7-7f1121debc71 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:39.224199861 +0000 UTC m=+35.302771837 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls") pod "dns-default-j4f7r" (UID: "245df2e7-7ac7-458b-a0e7-7f1121debc71") : secret "dns-default-metrics-tls" not found Apr 23 17:41:38.499954 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.499796 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"] Apr 23 17:41:38.501729 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.501707 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"] Apr 23 17:41:38.505162 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.505142 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb"] Apr 23 17:41:38.528627 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:38.528595 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd621488_e3c6_474f_8312_4fc8287e45e6.slice/crio-d83cafb6a66718dd66d3ad6790a555b541a3e34102491472eba1f84f4b15c518 WatchSource:0}: Error finding container d83cafb6a66718dd66d3ad6790a555b541a3e34102491472eba1f84f4b15c518: Status 404 returned error can't find the container with id d83cafb6a66718dd66d3ad6790a555b541a3e34102491472eba1f84f4b15c518 Apr 23 17:41:38.529210 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:38.529178 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0204dafa_5b79_44f7_b2f8_75ffc5549db4.slice/crio-203df2493805340d24bde6473f064e33f03c73a23e672782175a1fa1db08b94b WatchSource:0}: Error finding container 203df2493805340d24bde6473f064e33f03c73a23e672782175a1fa1db08b94b: Status 404 returned error can't find the container with id 
203df2493805340d24bde6473f064e33f03c73a23e672782175a1fa1db08b94b Apr 23 17:41:38.529979 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:38.529954 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7819252_f0cd_46a2_a9e8_b9a71ed7592a.slice/crio-4024ee748a0860a17aa41f8a900f88fcf028987b53c64d89b3d9ee88f510a72e WatchSource:0}: Error finding container 4024ee748a0860a17aa41f8a900f88fcf028987b53c64d89b3d9ee88f510a72e: Status 404 returned error can't find the container with id 4024ee748a0860a17aa41f8a900f88fcf028987b53c64d89b3d9ee88f510a72e Apr 23 17:41:38.731824 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.731787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t79h9" event={"ID":"adddecdf-d1af-4726-baac-6b7ff1828f40","Type":"ContainerStarted","Data":"cd0b4385e89f21cb7172129107d68f9613b1b228e1d2567c373da748fe2eaa5b"} Apr 23 17:41:38.733004 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.732968 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" event={"ID":"bd621488-e3c6-474f-8312-4fc8287e45e6","Type":"ContainerStarted","Data":"d83cafb6a66718dd66d3ad6790a555b541a3e34102491472eba1f84f4b15c518"} Apr 23 17:41:38.734097 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.734060 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" event={"ID":"f7819252-f0cd-46a2-a9e8-b9a71ed7592a","Type":"ContainerStarted","Data":"4024ee748a0860a17aa41f8a900f88fcf028987b53c64d89b3d9ee88f510a72e"} Apr 23 17:41:38.735051 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:38.735016 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" 
event={"ID":"0204dafa-5b79-44f7-b2f8-75ffc5549db4","Type":"ContainerStarted","Data":"203df2493805340d24bde6473f064e33f03c73a23e672782175a1fa1db08b94b"} Apr 23 17:41:39.133791 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:39.133732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:41:39.134314 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:39.133837 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:41:39.134314 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:39.133884 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:39.134314 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:39.133932 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:41:39.134314 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:39.133944 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d77897766-fz4ms: secret "image-registry-tls" not found Apr 23 17:41:39.134314 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:39.133963 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert podName:9840772f-9282-45fa-a2c3-d4bfeab937c3 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:41.133935384 +0000 UTC m=+37.212507356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert") pod "ingress-canary-4txc2" (UID: "9840772f-9282-45fa-a2c3-d4bfeab937c3") : secret "canary-serving-cert" not found Apr 23 17:41:39.134314 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:39.133980 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls podName:5052b556-e070-4527-add1-23d7af9bfa63 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:41.133970498 +0000 UTC m=+37.212542455 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls") pod "image-registry-6d77897766-fz4ms" (UID: "5052b556-e070-4527-add1-23d7af9bfa63") : secret "image-registry-tls" not found Apr 23 17:41:39.234550 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:39.234515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:39.234693 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:39.234674 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:39.234755 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:39.234742 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls podName:245df2e7-7ac7-458b-a0e7-7f1121debc71 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:41.234725494 +0000 UTC m=+37.313297454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls") pod "dns-default-j4f7r" (UID: "245df2e7-7ac7-458b-a0e7-7f1121debc71") : secret "dns-default-metrics-tls" not found Apr 23 17:41:39.741981 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:39.741887 2571 generic.go:358] "Generic (PLEG): container finished" podID="adddecdf-d1af-4726-baac-6b7ff1828f40" containerID="cd0b4385e89f21cb7172129107d68f9613b1b228e1d2567c373da748fe2eaa5b" exitCode=0 Apr 23 17:41:39.741981 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:39.741942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t79h9" event={"ID":"adddecdf-d1af-4726-baac-6b7ff1828f40","Type":"ContainerDied","Data":"cd0b4385e89f21cb7172129107d68f9613b1b228e1d2567c373da748fe2eaa5b"} Apr 23 17:41:40.750183 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:40.750138 2571 generic.go:358] "Generic (PLEG): container finished" podID="adddecdf-d1af-4726-baac-6b7ff1828f40" containerID="d399e04d6f001ba9000154b8c392827088b8ee6bfa8f60d1a7d9a41e3020ec41" exitCode=0 Apr 23 17:41:40.750682 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:40.750209 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t79h9" event={"ID":"adddecdf-d1af-4726-baac-6b7ff1828f40","Type":"ContainerDied","Data":"d399e04d6f001ba9000154b8c392827088b8ee6bfa8f60d1a7d9a41e3020ec41"} Apr 23 17:41:41.150308 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:41.150176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:41:41.151259 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:41.150924 
2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:41:41.151259 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:41.150949 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d77897766-fz4ms: secret "image-registry-tls" not found Apr 23 17:41:41.151259 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:41.151017 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls podName:5052b556-e070-4527-add1-23d7af9bfa63 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:45.150996686 +0000 UTC m=+41.229568648 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls") pod "image-registry-6d77897766-fz4ms" (UID: "5052b556-e070-4527-add1-23d7af9bfa63") : secret "image-registry-tls" not found Apr 23 17:41:41.151989 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:41.151929 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:41:41.152103 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:41.152075 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:41.152167 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:41.152142 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert podName:9840772f-9282-45fa-a2c3-d4bfeab937c3 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:45.15212472 +0000 UTC m=+41.230696683 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert") pod "ingress-canary-4txc2" (UID: "9840772f-9282-45fa-a2c3-d4bfeab937c3") : secret "canary-serving-cert" not found Apr 23 17:41:41.253132 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:41.253091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:41.253379 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:41.253345 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:41.253504 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:41.253419 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls podName:245df2e7-7ac7-458b-a0e7-7f1121debc71 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:45.253398949 +0000 UTC m=+41.331970965 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls") pod "dns-default-j4f7r" (UID: "245df2e7-7ac7-458b-a0e7-7f1121debc71") : secret "dns-default-metrics-tls" not found Apr 23 17:41:45.185687 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.185578 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:41:45.185687 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.185669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:41:45.186205 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:45.185839 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:41:45.186205 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:45.185852 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d77897766-fz4ms: secret "image-registry-tls" not found Apr 23 17:41:45.186205 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:45.185895 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls podName:5052b556-e070-4527-add1-23d7af9bfa63 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:53.185881344 +0000 UTC m=+49.264453304 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls") pod "image-registry-6d77897766-fz4ms" (UID: "5052b556-e070-4527-add1-23d7af9bfa63") : secret "image-registry-tls" not found Apr 23 17:41:45.186205 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:45.185943 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:45.186205 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:45.185962 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert podName:9840772f-9282-45fa-a2c3-d4bfeab937c3 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:53.185956436 +0000 UTC m=+49.264528392 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert") pod "ingress-canary-4txc2" (UID: "9840772f-9282-45fa-a2c3-d4bfeab937c3") : secret "canary-serving-cert" not found Apr 23 17:41:45.286190 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.286155 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:45.286337 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:45.286276 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:45.286377 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:45.286338 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls podName:245df2e7-7ac7-458b-a0e7-7f1121debc71 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:53.28631947 +0000 UTC m=+49.364891431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls") pod "dns-default-j4f7r" (UID: "245df2e7-7ac7-458b-a0e7-7f1121debc71") : secret "dns-default-metrics-tls" not found Apr 23 17:41:45.760538 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.760501 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" event={"ID":"0204dafa-5b79-44f7-b2f8-75ffc5549db4","Type":"ContainerStarted","Data":"6afda8156e89d6fb3e0a3d40106bd39d3b2c8d877aab238b0495ddee82cabd82"} Apr 23 17:41:45.763555 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.763521 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t79h9" event={"ID":"adddecdf-d1af-4726-baac-6b7ff1828f40","Type":"ContainerStarted","Data":"65976f7497b96d5041eb95ee6b0a533bdd44b9fb2d10d39bbeb66b2fad215e31"} Apr 23 17:41:45.764718 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.764695 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" event={"ID":"bd621488-e3c6-474f-8312-4fc8287e45e6","Type":"ContainerStarted","Data":"811e091888a65ed351aada668feb2a7fe4cfa51d80cf625c4c32951cbf283837"} Apr 23 17:41:45.764898 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.764875 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" Apr 23 17:41:45.765974 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.765955 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" 
event={"ID":"f7819252-f0cd-46a2-a9e8-b9a71ed7592a","Type":"ContainerStarted","Data":"8ebf39761eb0dfe31714ce961fe32044c1e30c21228552353d1699f42b7fd999"} Apr 23 17:41:45.766680 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.766663 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" Apr 23 17:41:45.776384 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.776346 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" podStartSLOduration=12.370217652000001 podStartE2EDuration="18.776332035s" podCreationTimestamp="2026-04-23 17:41:27 +0000 UTC" firstStartedPulling="2026-04-23 17:41:38.535590111 +0000 UTC m=+34.614162080" lastFinishedPulling="2026-04-23 17:41:44.941704492 +0000 UTC m=+41.020276463" observedRunningTime="2026-04-23 17:41:45.77558986 +0000 UTC m=+41.854161838" watchObservedRunningTime="2026-04-23 17:41:45.776332035 +0000 UTC m=+41.854904007" Apr 23 17:41:45.794020 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.793974 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t79h9" podStartSLOduration=8.948671973 podStartE2EDuration="41.79396154s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:41:05.713431537 +0000 UTC m=+1.792003494" lastFinishedPulling="2026-04-23 17:41:38.558721099 +0000 UTC m=+34.637293061" observedRunningTime="2026-04-23 17:41:45.792891413 +0000 UTC m=+41.871463391" watchObservedRunningTime="2026-04-23 17:41:45.79396154 +0000 UTC m=+41.872533518" Apr 23 17:41:45.812609 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:45.812564 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" 
podStartSLOduration=12.390710249 podStartE2EDuration="18.81255142s" podCreationTimestamp="2026-04-23 17:41:27 +0000 UTC" firstStartedPulling="2026-04-23 17:41:38.535503662 +0000 UTC m=+34.614075618" lastFinishedPulling="2026-04-23 17:41:44.957344818 +0000 UTC m=+41.035916789" observedRunningTime="2026-04-23 17:41:45.812511535 +0000 UTC m=+41.891083512" watchObservedRunningTime="2026-04-23 17:41:45.81255142 +0000 UTC m=+41.891123389" Apr 23 17:41:46.396173 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:46.396135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:46.399727 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:46.399703 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e92c79cf-5cbc-4aab-b574-cecf28cb3b0f-original-pull-secret\") pod \"global-pull-secret-syncer-hqlck\" (UID: \"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f\") " pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:46.500745 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:46.500709 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqlck" Apr 23 17:41:46.646708 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:46.646633 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hqlck"] Apr 23 17:41:46.649911 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:41:46.649886 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode92c79cf_5cbc_4aab_b574_cecf28cb3b0f.slice/crio-3613435115fe845e6a81a34f8348a1432a6fb945915fbc9d8971073cd6e9a62d WatchSource:0}: Error finding container 3613435115fe845e6a81a34f8348a1432a6fb945915fbc9d8971073cd6e9a62d: Status 404 returned error can't find the container with id 3613435115fe845e6a81a34f8348a1432a6fb945915fbc9d8971073cd6e9a62d Apr 23 17:41:46.769870 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:46.769832 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hqlck" event={"ID":"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f","Type":"ContainerStarted","Data":"3613435115fe845e6a81a34f8348a1432a6fb945915fbc9d8971073cd6e9a62d"} Apr 23 17:41:48.776400 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:48.776351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" event={"ID":"f7819252-f0cd-46a2-a9e8-b9a71ed7592a","Type":"ContainerStarted","Data":"89a3c95d734752bf13bf1fe28b25babaadc251ad066245126c0e3fca997c6fa6"} Apr 23 17:41:48.776400 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:48.776407 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" event={"ID":"f7819252-f0cd-46a2-a9e8-b9a71ed7592a","Type":"ContainerStarted","Data":"91d4806f982e591de6aaa51bd6a2a256ffea20f3521289732c09823336ee6986"} Apr 23 17:41:48.792323 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:48.792268 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" podStartSLOduration=11.964615921 podStartE2EDuration="21.792248104s" podCreationTimestamp="2026-04-23 17:41:27 +0000 UTC" firstStartedPulling="2026-04-23 17:41:38.535374208 +0000 UTC m=+34.613946183" lastFinishedPulling="2026-04-23 17:41:48.363006407 +0000 UTC m=+44.441578366" observedRunningTime="2026-04-23 17:41:48.791385755 +0000 UTC m=+44.869957734" watchObservedRunningTime="2026-04-23 17:41:48.792248104 +0000 UTC m=+44.870820085" Apr 23 17:41:51.785454 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:51.785416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hqlck" event={"ID":"e92c79cf-5cbc-4aab-b574-cecf28cb3b0f","Type":"ContainerStarted","Data":"6bff28f5606322e2aadb45a1061134bc381acaf1c7120e37eff0bf546a52e6d3"} Apr 23 17:41:51.800501 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:51.800446 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hqlck" podStartSLOduration=33.760570948 podStartE2EDuration="37.80043132s" podCreationTimestamp="2026-04-23 17:41:14 +0000 UTC" firstStartedPulling="2026-04-23 17:41:46.652725087 +0000 UTC m=+42.731297044" lastFinishedPulling="2026-04-23 17:41:50.692585456 +0000 UTC m=+46.771157416" observedRunningTime="2026-04-23 17:41:51.799650179 +0000 UTC m=+47.878222172" watchObservedRunningTime="2026-04-23 17:41:51.80043132 +0000 UTC m=+47.879003297" Apr 23 17:41:53.247034 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:53.246988 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 
23 17:41:53.247447 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:53.247096 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:41:53.247447 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:53.247159 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:41:53.247447 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:53.247182 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d77897766-fz4ms: secret "image-registry-tls" not found Apr 23 17:41:53.247447 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:53.247243 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls podName:5052b556-e070-4527-add1-23d7af9bfa63 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:09.247226164 +0000 UTC m=+65.325798125 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls") pod "image-registry-6d77897766-fz4ms" (UID: "5052b556-e070-4527-add1-23d7af9bfa63") : secret "image-registry-tls" not found Apr 23 17:41:53.247447 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:53.247248 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:53.247447 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:53.247297 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert podName:9840772f-9282-45fa-a2c3-d4bfeab937c3 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:42:09.247284202 +0000 UTC m=+65.325856159 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert") pod "ingress-canary-4txc2" (UID: "9840772f-9282-45fa-a2c3-d4bfeab937c3") : secret "canary-serving-cert" not found Apr 23 17:41:53.347612 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:41:53.347566 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:41:53.347809 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:53.347676 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:53.347809 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:41:53.347730 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls podName:245df2e7-7ac7-458b-a0e7-7f1121debc71 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:09.347716117 +0000 UTC m=+65.426288072 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls") pod "dns-default-j4f7r" (UID: "245df2e7-7ac7-458b-a0e7-7f1121debc71") : secret "dns-default-metrics-tls" not found Apr 23 17:42:03.730789 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:03.730747 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cqkbt" Apr 23 17:42:09.162454 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.162413 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:42:09.165159 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.165139 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:42:09.175110 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.175083 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:42:09.186173 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.186145 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gshbt\" (UniqueName: \"kubernetes.io/projected/9c2a48b6-a8f6-4813-bb00-957aa2486e5e-kube-api-access-gshbt\") pod \"network-check-target-8rrlh\" (UID: \"9c2a48b6-a8f6-4813-bb00-957aa2486e5e\") " pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:42:09.263431 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.263387 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:42:09.263431 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.263429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:42:09.263641 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.263460 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:42:09.263641 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:09.263543 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:09.263641 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:09.263547 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:42:09.263641 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:09.263618 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert podName:9840772f-9282-45fa-a2c3-d4bfeab937c3 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:41.26360231 +0000 UTC m=+97.342174266 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert") pod "ingress-canary-4txc2" (UID: "9840772f-9282-45fa-a2c3-d4bfeab937c3") : secret "canary-serving-cert" not found Apr 23 17:42:09.263641 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:09.263626 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d77897766-fz4ms: secret "image-registry-tls" not found Apr 23 17:42:09.263896 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:09.263703 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls podName:5052b556-e070-4527-add1-23d7af9bfa63 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:41.26368812 +0000 UTC m=+97.342260076 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls") pod "image-registry-6d77897766-fz4ms" (UID: "5052b556-e070-4527-add1-23d7af9bfa63") : secret "image-registry-tls" not found Apr 23 17:42:09.265591 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.265575 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:42:09.274309 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:09.274294 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:42:09.274369 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:09.274358 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs podName:9d9e614d-f61d-4fcd-aaaf-ab97f54f2487 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:13.274343423 +0000 UTC m=+129.352915378 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs") pod "network-metrics-daemon-qzv7h" (UID: "9d9e614d-f61d-4fcd-aaaf-ab97f54f2487") : secret "metrics-daemon-secret" not found Apr 23 17:42:09.310718 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.310690 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kjp2q\"" Apr 23 17:42:09.319453 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.319429 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:42:09.364758 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.364716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:42:09.364930 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:09.364906 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:09.365004 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:09.364992 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls podName:245df2e7-7ac7-458b-a0e7-7f1121debc71 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:41.364970757 +0000 UTC m=+97.443542726 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls") pod "dns-default-j4f7r" (UID: "245df2e7-7ac7-458b-a0e7-7f1121debc71") : secret "dns-default-metrics-tls" not found Apr 23 17:42:09.445716 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.445638 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8rrlh"] Apr 23 17:42:09.448710 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:42:09.448678 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c2a48b6_a8f6_4813_bb00_957aa2486e5e.slice/crio-9341dc2a8940065dce7e7df522ded0e51f3d02eee6e42176e679d5e9a3ff56ee WatchSource:0}: Error finding container 9341dc2a8940065dce7e7df522ded0e51f3d02eee6e42176e679d5e9a3ff56ee: Status 404 returned error can't find the container with id 9341dc2a8940065dce7e7df522ded0e51f3d02eee6e42176e679d5e9a3ff56ee Apr 23 17:42:09.832257 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:09.832216 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8rrlh" event={"ID":"9c2a48b6-a8f6-4813-bb00-957aa2486e5e","Type":"ContainerStarted","Data":"9341dc2a8940065dce7e7df522ded0e51f3d02eee6e42176e679d5e9a3ff56ee"} Apr 23 17:42:12.841353 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:12.841319 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8rrlh" event={"ID":"9c2a48b6-a8f6-4813-bb00-957aa2486e5e","Type":"ContainerStarted","Data":"b7690d92ccf8417a9fba289540dd993b0468e1b07b51f48bb0da21106828fec8"} Apr 23 17:42:13.843571 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:13.843540 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:42:13.856859 ip-10-0-138-17 kubenswrapper[2571]: 
I0423 17:42:13.856812 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-8rrlh" podStartSLOduration=66.588476123 podStartE2EDuration="1m9.856795173s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:42:09.450682192 +0000 UTC m=+65.529254148" lastFinishedPulling="2026-04-23 17:42:12.719001229 +0000 UTC m=+68.797573198" observedRunningTime="2026-04-23 17:42:13.856317791 +0000 UTC m=+69.934889768" watchObservedRunningTime="2026-04-23 17:42:13.856795173 +0000 UTC m=+69.935367148" Apr 23 17:42:41.311185 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:41.311151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:42:41.311602 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:41.311216 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:42:41.311602 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:41.311310 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:41.311602 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:41.311356 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:42:41.311602 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:41.311375 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-6d77897766-fz4ms: secret "image-registry-tls" not found Apr 23 17:42:41.311602 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:41.311376 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert podName:9840772f-9282-45fa-a2c3-d4bfeab937c3 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:45.311359679 +0000 UTC m=+161.389931638 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert") pod "ingress-canary-4txc2" (UID: "9840772f-9282-45fa-a2c3-d4bfeab937c3") : secret "canary-serving-cert" not found Apr 23 17:42:41.311602 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:41.311424 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls podName:5052b556-e070-4527-add1-23d7af9bfa63 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:45.311411229 +0000 UTC m=+161.389983189 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls") pod "image-registry-6d77897766-fz4ms" (UID: "5052b556-e070-4527-add1-23d7af9bfa63") : secret "image-registry-tls" not found Apr 23 17:42:41.411977 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:41.411931 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:42:41.412157 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:41.412052 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:41.412157 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:42:41.412114 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls podName:245df2e7-7ac7-458b-a0e7-7f1121debc71 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:45.412100537 +0000 UTC m=+161.490672497 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls") pod "dns-default-j4f7r" (UID: "245df2e7-7ac7-458b-a0e7-7f1121debc71") : secret "dns-default-metrics-tls" not found Apr 23 17:42:44.848442 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:42:44.848405 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-8rrlh" Apr 23 17:43:13.355325 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:13.355279 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:43:13.355836 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:43:13.355434 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:43:13.355836 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:43:13.355512 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs podName:9d9e614d-f61d-4fcd-aaaf-ab97f54f2487 nodeName:}" failed. No retries permitted until 2026-04-23 17:45:15.355497192 +0000 UTC m=+251.434069148 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs") pod "network-metrics-daemon-qzv7h" (UID: "9d9e614d-f61d-4fcd-aaaf-ab97f54f2487") : secret "metrics-daemon-secret" not found Apr 23 17:43:17.145057 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:17.145030 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cxr7m_33946126-5bdc-4047-b09e-8ca68acdbd65/dns-node-resolver/0.log" Apr 23 17:43:18.550731 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:18.550701 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sdzf6_7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f/node-ca/0.log" Apr 23 17:43:40.373851 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:43:40.373808 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" Apr 23 17:43:40.440166 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:43:40.440131 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4txc2" podUID="9840772f-9282-45fa-a2c3-d4bfeab937c3" Apr 23 17:43:40.464437 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:43:40.464397 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-j4f7r" podUID="245df2e7-7ac7-458b-a0e7-7f1121debc71" Apr 23 17:43:40.514625 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:43:40.514580 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], 
failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qzv7h" podUID="9d9e614d-f61d-4fcd-aaaf-ab97f54f2487" Apr 23 17:43:41.044498 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:41.044462 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:43:45.402282 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.402241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:43:45.402670 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.402293 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:43:45.404790 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.404751 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9840772f-9282-45fa-a2c3-d4bfeab937c3-cert\") pod \"ingress-canary-4txc2\" (UID: \"9840772f-9282-45fa-a2c3-d4bfeab937c3\") " pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:43:45.404889 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.404842 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"image-registry-6d77897766-fz4ms\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:43:45.502997 
ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.502957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:43:45.505522 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.505494 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/245df2e7-7ac7-458b-a0e7-7f1121debc71-metrics-tls\") pod \"dns-default-j4f7r\" (UID: \"245df2e7-7ac7-458b-a0e7-7f1121debc71\") " pod="openshift-dns/dns-default-j4f7r" Apr 23 17:43:45.547957 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.547922 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7hx6w\"" Apr 23 17:43:45.555854 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.555830 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:43:45.681696 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.681659 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d77897766-fz4ms"] Apr 23 17:43:45.685845 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:43:45.685810 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5052b556_e070_4527_add1_23d7af9bfa63.slice/crio-ac935d6a0dc7a7e5c3d213c18c1a3b0fc52027b66469f3f5563232250cc8482c WatchSource:0}: Error finding container ac935d6a0dc7a7e5c3d213c18c1a3b0fc52027b66469f3f5563232250cc8482c: Status 404 returned error can't find the container with id ac935d6a0dc7a7e5c3d213c18c1a3b0fc52027b66469f3f5563232250cc8482c Apr 23 17:43:45.766171 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:45.766065 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" podUID="bd621488-e3c6-474f-8312-4fc8287e45e6" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused" Apr 23 17:43:46.057269 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.057192 2571 generic.go:358] "Generic (PLEG): container finished" podID="0204dafa-5b79-44f7-b2f8-75ffc5549db4" containerID="6afda8156e89d6fb3e0a3d40106bd39d3b2c8d877aab238b0495ddee82cabd82" exitCode=255 Apr 23 17:43:46.057495 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.057287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" event={"ID":"0204dafa-5b79-44f7-b2f8-75ffc5549db4","Type":"ContainerDied","Data":"6afda8156e89d6fb3e0a3d40106bd39d3b2c8d877aab238b0495ddee82cabd82"} Apr 23 17:43:46.057712 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.057693 
2571 scope.go:117] "RemoveContainer" containerID="6afda8156e89d6fb3e0a3d40106bd39d3b2c8d877aab238b0495ddee82cabd82" Apr 23 17:43:46.058617 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.058596 2571 generic.go:358] "Generic (PLEG): container finished" podID="bd621488-e3c6-474f-8312-4fc8287e45e6" containerID="811e091888a65ed351aada668feb2a7fe4cfa51d80cf625c4c32951cbf283837" exitCode=1 Apr 23 17:43:46.058690 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.058670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" event={"ID":"bd621488-e3c6-474f-8312-4fc8287e45e6","Type":"ContainerDied","Data":"811e091888a65ed351aada668feb2a7fe4cfa51d80cf625c4c32951cbf283837"} Apr 23 17:43:46.059037 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.059003 2571 scope.go:117] "RemoveContainer" containerID="811e091888a65ed351aada668feb2a7fe4cfa51d80cf625c4c32951cbf283837" Apr 23 17:43:46.060192 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.060068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerStarted","Data":"a1d2df3370fc783941501d4f41c8e24735ca68d5f6f218645b791b937f852053"} Apr 23 17:43:46.060192 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.060099 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerStarted","Data":"ac935d6a0dc7a7e5c3d213c18c1a3b0fc52027b66469f3f5563232250cc8482c"} Apr 23 17:43:46.060313 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.060201 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:43:46.121609 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:46.121555 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podStartSLOduration=162.121535863 podStartE2EDuration="2m42.121535863s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:43:46.120405963 +0000 UTC m=+162.198977941" watchObservedRunningTime="2026-04-23 17:43:46.121535863 +0000 UTC m=+162.200107842" Apr 23 17:43:47.064451 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:47.064410 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" event={"ID":"0204dafa-5b79-44f7-b2f8-75ffc5549db4","Type":"ContainerStarted","Data":"c8cc8976415e42ecc735c50906cd4b7273fb126f53487c50eed83ce2aa5040c3"} Apr 23 17:43:47.065859 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:47.065835 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" event={"ID":"bd621488-e3c6-474f-8312-4fc8287e45e6","Type":"ContainerStarted","Data":"cf6a4f0a59dcd08fe40acbb8b9874cb28e94e28b8955775892b560397a459c06"} Apr 23 17:43:47.066206 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:47.066189 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" Apr 23 17:43:47.066746 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:47.066729 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" Apr 23 17:43:51.488511 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:51.488471 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j4f7r" Apr 23 17:43:51.491075 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:51.491053 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-b6vkm\"" Apr 23 17:43:51.499815 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:51.499792 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j4f7r" Apr 23 17:43:51.622750 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:51.622717 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j4f7r"] Apr 23 17:43:51.626219 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:43:51.626191 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod245df2e7_7ac7_458b_a0e7_7f1121debc71.slice/crio-88173f6ca46a662b74d341d231a423a973292e590c53cc2298906a100763ad56 WatchSource:0}: Error finding container 88173f6ca46a662b74d341d231a423a973292e590c53cc2298906a100763ad56: Status 404 returned error can't find the container with id 88173f6ca46a662b74d341d231a423a973292e590c53cc2298906a100763ad56 Apr 23 17:43:52.083259 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:52.083218 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j4f7r" event={"ID":"245df2e7-7ac7-458b-a0e7-7f1121debc71","Type":"ContainerStarted","Data":"88173f6ca46a662b74d341d231a423a973292e590c53cc2298906a100763ad56"} Apr 23 17:43:53.087109 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:53.087017 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j4f7r" event={"ID":"245df2e7-7ac7-458b-a0e7-7f1121debc71","Type":"ContainerStarted","Data":"de26f2243c4dc52e3d34bccc5d7758d20c400d62956bb18136795ed77e368fc6"} Apr 23 17:43:53.087109 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:53.087052 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-j4f7r" event={"ID":"245df2e7-7ac7-458b-a0e7-7f1121debc71","Type":"ContainerStarted","Data":"6ca33d3dc4e67194b9abd48636f7946a5e955f0a941877a178c6366d53d2d776"} Apr 23 17:43:53.087485 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:53.087145 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-j4f7r" Apr 23 17:43:53.108584 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:53.108529 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j4f7r" podStartSLOduration=134.950993831 podStartE2EDuration="2m16.108515329s" podCreationTimestamp="2026-04-23 17:41:37 +0000 UTC" firstStartedPulling="2026-04-23 17:43:51.628383278 +0000 UTC m=+167.706955236" lastFinishedPulling="2026-04-23 17:43:52.785904772 +0000 UTC m=+168.864476734" observedRunningTime="2026-04-23 17:43:53.107654981 +0000 UTC m=+169.186226982" watchObservedRunningTime="2026-04-23 17:43:53.108515329 +0000 UTC m=+169.187087306" Apr 23 17:43:55.488368 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:55.488322 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:43:55.488852 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:55.488322 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:43:55.490997 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:55.490980 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bmxrf\"" Apr 23 17:43:55.499320 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:55.499298 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4txc2" Apr 23 17:43:55.620377 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:55.620344 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4txc2"] Apr 23 17:43:55.624725 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:43:55.624676 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9840772f_9282_45fa_a2c3_d4bfeab937c3.slice/crio-53eb07365fcfcefe7024e7b9b5c8f2571f8813f483bd5b0ebf09de7d1a350e52 WatchSource:0}: Error finding container 53eb07365fcfcefe7024e7b9b5c8f2571f8813f483bd5b0ebf09de7d1a350e52: Status 404 returned error can't find the container with id 53eb07365fcfcefe7024e7b9b5c8f2571f8813f483bd5b0ebf09de7d1a350e52 Apr 23 17:43:56.096817 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:56.096758 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4txc2" event={"ID":"9840772f-9282-45fa-a2c3-d4bfeab937c3","Type":"ContainerStarted","Data":"53eb07365fcfcefe7024e7b9b5c8f2571f8813f483bd5b0ebf09de7d1a350e52"} Apr 23 17:43:58.102832 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:58.102791 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4txc2" event={"ID":"9840772f-9282-45fa-a2c3-d4bfeab937c3","Type":"ContainerStarted","Data":"a97a752e203122615d2280ffd7893270b7d5b952d2609b541985ef079f774c96"} Apr 23 17:43:58.122740 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:43:58.122694 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4txc2" podStartSLOduration=139.680236978 podStartE2EDuration="2m21.122680351s" podCreationTimestamp="2026-04-23 17:41:37 +0000 UTC" firstStartedPulling="2026-04-23 17:43:55.626244272 +0000 UTC m=+171.704816228" lastFinishedPulling="2026-04-23 17:43:57.06868764 +0000 UTC m=+173.147259601" 
observedRunningTime="2026-04-23 17:43:58.121531817 +0000 UTC m=+174.200103795" watchObservedRunningTime="2026-04-23 17:43:58.122680351 +0000 UTC m=+174.201252329" Apr 23 17:44:03.092244 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:03.092211 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j4f7r" Apr 23 17:44:05.560228 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:05.560187 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:05.560673 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:05.560278 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:07.069982 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:07.069948 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:07.070327 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:07.070010 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:07.710289 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:07.710246 2571 prober.go:120] "Probe 
failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" podUID="f7819252-f0cd-46a2-a9e8-b9a71ed7592a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 17:44:15.559687 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:15.559648 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:15.560161 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:15.559731 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:17.070353 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:17.070314 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:17.070710 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:17.070380 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:17.709795 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:17.709735 2571 prober.go:120] "Probe failed" probeType="Liveness" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" podUID="f7819252-f0cd-46a2-a9e8-b9a71ed7592a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 17:44:25.560225 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:25.560189 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:25.560591 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:25.560245 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:25.560591 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:25.560284 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:44:25.560749 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:25.560714 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"a1d2df3370fc783941501d4f41c8e24735ca68d5f6f218645b791b937f852053"} pod="openshift-image-registry/image-registry-6d77897766-fz4ms" containerMessage="Container registry failed liveness probe, will be restarted" Apr 23 17:44:25.563980 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:25.563943 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service 
unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:25.564096 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:25.564002 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:27.710415 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:27.710374 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" podUID="f7819252-f0cd-46a2-a9e8-b9a71ed7592a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 17:44:27.710787 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:27.710457 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" Apr 23 17:44:27.710936 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:27.710918 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"89a3c95d734752bf13bf1fe28b25babaadc251ad066245126c0e3fca997c6fa6"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 17:44:27.710973 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:27.710959 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" podUID="f7819252-f0cd-46a2-a9e8-b9a71ed7592a" containerName="service-proxy" containerID="cri-o://89a3c95d734752bf13bf1fe28b25babaadc251ad066245126c0e3fca997c6fa6" gracePeriod=30 Apr 23 17:44:28.179542 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:28.179505 2571 generic.go:358] 
"Generic (PLEG): container finished" podID="f7819252-f0cd-46a2-a9e8-b9a71ed7592a" containerID="89a3c95d734752bf13bf1fe28b25babaadc251ad066245126c0e3fca997c6fa6" exitCode=2 Apr 23 17:44:28.179716 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:28.179573 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" event={"ID":"f7819252-f0cd-46a2-a9e8-b9a71ed7592a","Type":"ContainerDied","Data":"89a3c95d734752bf13bf1fe28b25babaadc251ad066245126c0e3fca997c6fa6"} Apr 23 17:44:28.179716 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:28.179611 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8f68778c8-gnsjb" event={"ID":"f7819252-f0cd-46a2-a9e8-b9a71ed7592a","Type":"ContainerStarted","Data":"42b29b1898f750231a311b56148ee9a7762b8fea6ce9a54c9d3fdb62fac9ef38"} Apr 23 17:44:35.564789 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:35.564732 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:35.565170 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:35.564813 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:41.281528 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:41.281491 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j4f7r_245df2e7-7ac7-458b-a0e7-7f1121debc71/dns/0.log" Apr 23 17:44:41.467255 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:41.467179 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j4f7r_245df2e7-7ac7-458b-a0e7-7f1121debc71/kube-rbac-proxy/0.log" Apr 23 17:44:42.065386 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:42.065350 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cxr7m_33946126-5bdc-4047-b09e-8ca68acdbd65/dns-node-resolver/0.log" Apr 23 17:44:42.665307 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:42.665283 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6d77897766-fz4ms_5052b556-e070-4527-add1-23d7af9bfa63/registry/0.log" Apr 23 17:44:43.670168 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:43.670135 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sdzf6_7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f/node-ca/0.log" Apr 23 17:44:44.067480 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:44.067454 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4txc2_9840772f-9282-45fa-a2c3-d4bfeab937c3/serve-healthcheck-canary/0.log" Apr 23 17:44:45.565123 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:45.565089 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:44:45.565530 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:45.565141 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:44:50.579359 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:50.579312 2571 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" containerID="cri-o://a1d2df3370fc783941501d4f41c8e24735ca68d5f6f218645b791b937f852053" gracePeriod=30 Apr 23 17:44:51.240525 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:51.240481 2571 generic.go:358] "Generic (PLEG): container finished" podID="5052b556-e070-4527-add1-23d7af9bfa63" containerID="a1d2df3370fc783941501d4f41c8e24735ca68d5f6f218645b791b937f852053" exitCode=0 Apr 23 17:44:51.240700 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:51.240567 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerDied","Data":"a1d2df3370fc783941501d4f41c8e24735ca68d5f6f218645b791b937f852053"} Apr 23 17:44:51.240700 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:51.240605 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerStarted","Data":"1e37dfc4721d7d9cb08119e0b7e13f69d0ab799fe9887b4e5d3a70dd09fd56a9"} Apr 23 17:44:51.240700 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:44:51.240652 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:45:05.560628 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:05.560594 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:05.561031 ip-10-0-138-17 kubenswrapper[2571]: I0423 
17:45:05.560643 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:12.248605 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:12.248563 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:12.248995 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:12.248622 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:15.438897 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:15.438840 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:45:15.441283 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:15.441264 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d9e614d-f61d-4fcd-aaaf-ab97f54f2487-metrics-certs\") pod \"network-metrics-daemon-qzv7h\" (UID: \"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487\") " pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:45:15.560384 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:15.560352 2571 patch_prober.go:28] interesting 
pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:15.560552 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:15.560410 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:15.591517 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:15.591486 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qk5s8\"" Apr 23 17:45:15.599313 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:15.599289 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzv7h" Apr 23 17:45:15.721749 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:15.721673 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qzv7h"] Apr 23 17:45:15.724814 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:45:15.724783 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d9e614d_f61d_4fcd_aaaf_ab97f54f2487.slice/crio-1fe09458ff80d1374a311805be9517969b1c8cc45cd43470555d06b6cc8732aa WatchSource:0}: Error finding container 1fe09458ff80d1374a311805be9517969b1c8cc45cd43470555d06b6cc8732aa: Status 404 returned error can't find the container with id 1fe09458ff80d1374a311805be9517969b1c8cc45cd43470555d06b6cc8732aa Apr 23 17:45:16.301881 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:16.301842 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-qzv7h" event={"ID":"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487","Type":"ContainerStarted","Data":"1fe09458ff80d1374a311805be9517969b1c8cc45cd43470555d06b6cc8732aa"} Apr 23 17:45:17.306181 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:17.306148 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qzv7h" event={"ID":"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487","Type":"ContainerStarted","Data":"0d23739d719ec183ccfd7eacf73a4f04f247996e13daecedf151d512f72b0505"} Apr 23 17:45:17.306181 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:17.306184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qzv7h" event={"ID":"9d9e614d-f61d-4fcd-aaaf-ab97f54f2487","Type":"ContainerStarted","Data":"b288fab44c152e3ce8108d7d494f25e1e8e9d70a1439484b23e3ae6e55589e79"} Apr 23 17:45:17.323482 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:17.323439 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qzv7h" podStartSLOduration=252.386858161 podStartE2EDuration="4m13.323423898s" podCreationTimestamp="2026-04-23 17:41:04 +0000 UTC" firstStartedPulling="2026-04-23 17:45:15.726792634 +0000 UTC m=+251.805364596" lastFinishedPulling="2026-04-23 17:45:16.663358374 +0000 UTC m=+252.741930333" observedRunningTime="2026-04-23 17:45:17.322494008 +0000 UTC m=+253.401065987" watchObservedRunningTime="2026-04-23 17:45:17.323423898 +0000 UTC m=+253.401995876" Apr 23 17:45:22.247412 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:22.247376 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:22.247893 
ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:22.247429 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:25.559888 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:25.559810 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:25.560292 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:25.559881 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:25.560292 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:25.559930 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:45:25.560423 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:25.560348 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"1e37dfc4721d7d9cb08119e0b7e13f69d0ab799fe9887b4e5d3a70dd09fd56a9"} pod="openshift-image-registry/image-registry-6d77897766-fz4ms" containerMessage="Container registry failed liveness probe, will be restarted" Apr 23 17:45:25.563806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:25.563734 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with 
statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:25.563806 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:25.563801 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:35.564100 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:35.564064 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:35.564469 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:35.564120 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:45:45.564356 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:45.564321 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:45:45.564726 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:45.564375 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" 
output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:46.381703 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:46.381666 2571 generic.go:358] "Generic (PLEG): container finished" podID="bd621488-e3c6-474f-8312-4fc8287e45e6" containerID="cf6a4f0a59dcd08fe40acbb8b9874cb28e94e28b8955775892b560397a459c06" exitCode=1
Apr 23 17:45:46.381900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:46.381736 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" event={"ID":"bd621488-e3c6-474f-8312-4fc8287e45e6","Type":"ContainerDied","Data":"cf6a4f0a59dcd08fe40acbb8b9874cb28e94e28b8955775892b560397a459c06"}
Apr 23 17:45:46.381900 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:46.381793 2571 scope.go:117] "RemoveContainer" containerID="811e091888a65ed351aada668feb2a7fe4cfa51d80cf625c4c32951cbf283837"
Apr 23 17:45:46.382162 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:46.382141 2571 scope.go:117] "RemoveContainer" containerID="cf6a4f0a59dcd08fe40acbb8b9874cb28e94e28b8955775892b560397a459c06"
Apr 23 17:45:46.382383 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:45:46.382353 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=acm-agent pod=klusterlet-addon-workmgr-77df67f946-r5dl4_open-cluster-management-agent-addon(bd621488-e3c6-474f-8312-4fc8287e45e6)\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" podUID="bd621488-e3c6-474f-8312-4fc8287e45e6"
Apr 23 17:45:46.383380 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:46.383362 2571 generic.go:358] "Generic (PLEG): container finished" podID="0204dafa-5b79-44f7-b2f8-75ffc5549db4" containerID="c8cc8976415e42ecc735c50906cd4b7273fb126f53487c50eed83ce2aa5040c3" exitCode=255
Apr 23 17:45:46.383476 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:46.383393 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" event={"ID":"0204dafa-5b79-44f7-b2f8-75ffc5549db4","Type":"ContainerDied","Data":"c8cc8976415e42ecc735c50906cd4b7273fb126f53487c50eed83ce2aa5040c3"}
Apr 23 17:45:46.383710 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:46.383692 2571 scope.go:117] "RemoveContainer" containerID="c8cc8976415e42ecc735c50906cd4b7273fb126f53487c50eed83ce2aa5040c3"
Apr 23 17:45:46.383898 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:45:46.383880 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=addon-agent pod=managed-serviceaccount-addon-agent-ccd7b688f-jp4jp_open-cluster-management-agent-addon(0204dafa-5b79-44f7-b2f8-75ffc5549db4)\"" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" podUID="0204dafa-5b79-44f7-b2f8-75ffc5549db4"
Apr 23 17:45:46.392949 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:46.392934 2571 scope.go:117] "RemoveContainer" containerID="6afda8156e89d6fb3e0a3d40106bd39d3b2c8d877aab238b0495ddee82cabd82"
Apr 23 17:45:47.067055 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:47.067012 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:45:47.388495 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:47.388415 2571 scope.go:117] "RemoveContainer" containerID="cf6a4f0a59dcd08fe40acbb8b9874cb28e94e28b8955775892b560397a459c06"
Apr 23 17:45:47.388638 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:45:47.388617 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=acm-agent pod=klusterlet-addon-workmgr-77df67f946-r5dl4_open-cluster-management-agent-addon(bd621488-e3c6-474f-8312-4fc8287e45e6)\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" podUID="bd621488-e3c6-474f-8312-4fc8287e45e6"
Apr 23 17:45:47.666446 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:47.666339 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp"
Apr 23 17:45:47.666713 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:47.666699 2571 scope.go:117] "RemoveContainer" containerID="c8cc8976415e42ecc735c50906cd4b7273fb126f53487c50eed83ce2aa5040c3"
Apr 23 17:45:47.666930 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:45:47.666912 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=addon-agent pod=managed-serviceaccount-addon-agent-ccd7b688f-jp4jp_open-cluster-management-agent-addon(0204dafa-5b79-44f7-b2f8-75ffc5549db4)\"" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" podUID="0204dafa-5b79-44f7-b2f8-75ffc5549db4"
Apr 23 17:45:47.688598 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:47.688564 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:45:48.391013 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:48.390985 2571 scope.go:117] "RemoveContainer" containerID="cf6a4f0a59dcd08fe40acbb8b9874cb28e94e28b8955775892b560397a459c06"
Apr 23 17:45:48.391396 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:45:48.391162 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=acm-agent pod=klusterlet-addon-workmgr-77df67f946-r5dl4_open-cluster-management-agent-addon(bd621488-e3c6-474f-8312-4fc8287e45e6)\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" podUID="bd621488-e3c6-474f-8312-4fc8287e45e6"
Apr 23 17:45:50.578671 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:50.578635 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" containerID="cri-o://1e37dfc4721d7d9cb08119e0b7e13f69d0ab799fe9887b4e5d3a70dd09fd56a9" gracePeriod=30
Apr 23 17:45:51.399825 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:51.399786 2571 generic.go:358] "Generic (PLEG): container finished" podID="5052b556-e070-4527-add1-23d7af9bfa63" containerID="1e37dfc4721d7d9cb08119e0b7e13f69d0ab799fe9887b4e5d3a70dd09fd56a9" exitCode=0
Apr 23 17:45:51.400008 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:51.399842 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerDied","Data":"1e37dfc4721d7d9cb08119e0b7e13f69d0ab799fe9887b4e5d3a70dd09fd56a9"}
Apr 23 17:45:51.400008 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:51.399880 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerStarted","Data":"2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265"}
Apr 23 17:45:51.400008 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:51.399896 2571 scope.go:117] "RemoveContainer" containerID="a1d2df3370fc783941501d4f41c8e24735ca68d5f6f218645b791b937f852053"
Apr 23 17:45:51.400120 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:45:51.400097 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:46:00.488681 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:00.488641 2571 scope.go:117] "RemoveContainer" containerID="cf6a4f0a59dcd08fe40acbb8b9874cb28e94e28b8955775892b560397a459c06"
Apr 23 17:46:01.428136 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:01.428103 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4" event={"ID":"bd621488-e3c6-474f-8312-4fc8287e45e6","Type":"ContainerStarted","Data":"87e7b15bfc6d3cc64c88f8ba39393ec21c1ed65bcfdd7c4e13e2f1d1831eae6a"}
Apr 23 17:46:01.428401 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:01.428381 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:46:01.429888 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:01.429864 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77df67f946-r5dl4"
Apr 23 17:46:02.489823 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:02.489502 2571 scope.go:117] "RemoveContainer" containerID="c8cc8976415e42ecc735c50906cd4b7273fb126f53487c50eed83ce2aa5040c3"
Apr 23 17:46:03.435162 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:03.435128 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-ccd7b688f-jp4jp" event={"ID":"0204dafa-5b79-44f7-b2f8-75ffc5549db4","Type":"ContainerStarted","Data":"6f267d432c200fa42f2e0fd8c9e4e020ea4d774f8b63f29041c156c706991671"}
Apr 23 17:46:04.362555 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:04.362528 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log"
Apr 23 17:46:04.363030 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:04.362699 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log"
Apr 23 17:46:04.368044 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:04.368019 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 17:46:05.560353 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:05.560317 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:05.562955 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:05.560373 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:12.407825 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:12.407753 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:12.408212 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:12.407857 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:15.559753 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:15.559704 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:15.560153 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:15.559789 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:22.408025 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:22.407990 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:22.408419 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:22.408044 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:25.559639 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:25.559597 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:25.560019 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:25.559654 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:25.560019 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:25.559700 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6d77897766-fz4ms"
Apr 23 17:46:25.560279 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:25.560253 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265"} pod="openshift-image-registry/image-registry-6d77897766-fz4ms" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 23 17:46:25.563661 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:25.563632 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:25.563818 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:25.563685 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:35.564157 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:35.564123 2571 patch_prober.go:28] interesting pod/image-registry-6d77897766-fz4ms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:35.564617 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:35.564176 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:37.507402 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.507371 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4jc8x"]
Apr 23 17:46:37.510485 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.510467 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.512748 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.512726 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 17:46:37.513959 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.513940 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 17:46:37.514091 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.514072 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 17:46:37.520223 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.520199 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dt5dv\""
Apr 23 17:46:37.520223 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.520215 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 17:46:37.532451 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.532425 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4jc8x"]
Apr 23 17:46:37.592868 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.592837 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f58777e7-8e0a-42a0-ae61-087a5301da18-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.593051 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.592874 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f58777e7-8e0a-42a0-ae61-087a5301da18-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.593051 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.592902 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f58777e7-8e0a-42a0-ae61-087a5301da18-crio-socket\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.593051 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.592960 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f58777e7-8e0a-42a0-ae61-087a5301da18-data-volume\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.593051 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.592992 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skf2b\" (UniqueName: \"kubernetes.io/projected/f58777e7-8e0a-42a0-ae61-087a5301da18-kube-api-access-skf2b\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.694030 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.693996 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f58777e7-8e0a-42a0-ae61-087a5301da18-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.694207 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.694037 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f58777e7-8e0a-42a0-ae61-087a5301da18-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.694207 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.694056 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f58777e7-8e0a-42a0-ae61-087a5301da18-crio-socket\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.694207 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.694074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f58777e7-8e0a-42a0-ae61-087a5301da18-data-volume\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.694207 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.694113 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skf2b\" (UniqueName: \"kubernetes.io/projected/f58777e7-8e0a-42a0-ae61-087a5301da18-kube-api-access-skf2b\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.694207 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.694153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f58777e7-8e0a-42a0-ae61-087a5301da18-crio-socket\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.694678 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.694647 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f58777e7-8e0a-42a0-ae61-087a5301da18-data-volume\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.695058 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.695029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f58777e7-8e0a-42a0-ae61-087a5301da18-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.697491 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.697461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f58777e7-8e0a-42a0-ae61-087a5301da18-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.714386 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.714354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skf2b\" (UniqueName: \"kubernetes.io/projected/f58777e7-8e0a-42a0-ae61-087a5301da18-kube-api-access-skf2b\") pod \"insights-runtime-extractor-4jc8x\" (UID: \"f58777e7-8e0a-42a0-ae61-087a5301da18\") " pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.819664 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.819570 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4jc8x"
Apr 23 17:46:37.963692 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.963657 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4jc8x"]
Apr 23 17:46:37.967249 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:46:37.967220 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf58777e7_8e0a_42a0_ae61_087a5301da18.slice/crio-6bc7a9be63cf745238972db9c6063b5a5b2bd0b2903405d69085c38d372afe17 WatchSource:0}: Error finding container 6bc7a9be63cf745238972db9c6063b5a5b2bd0b2903405d69085c38d372afe17: Status 404 returned error can't find the container with id 6bc7a9be63cf745238972db9c6063b5a5b2bd0b2903405d69085c38d372afe17
Apr 23 17:46:37.969074 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:37.969059 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:46:38.522525 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:38.522489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4jc8x" event={"ID":"f58777e7-8e0a-42a0-ae61-087a5301da18","Type":"ContainerStarted","Data":"e1d0a8444cdaaab34ac15289d2c367635d39ece46cb3b97c08abaca5530e6992"}
Apr 23 17:46:38.522525 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:38.522527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4jc8x" event={"ID":"f58777e7-8e0a-42a0-ae61-087a5301da18","Type":"ContainerStarted","Data":"6bc7a9be63cf745238972db9c6063b5a5b2bd0b2903405d69085c38d372afe17"}
Apr 23 17:46:39.527285 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:39.527230 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4jc8x" event={"ID":"f58777e7-8e0a-42a0-ae61-087a5301da18","Type":"ContainerStarted","Data":"fe1baabc7142d0e1e51b97187e6282c9a5a2d6a6c59af7b097770cd72ffb58b3"}
Apr 23 17:46:40.531841 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:40.531801 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4jc8x" event={"ID":"f58777e7-8e0a-42a0-ae61-087a5301da18","Type":"ContainerStarted","Data":"7b32cffa25c2ff84c585858f0879e7bbb48ad256bb91e5fd85060b2a13e05c07"}
Apr 23 17:46:40.554725 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:40.554671 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4jc8x" podStartSLOduration=1.563322371 podStartE2EDuration="3.554654959s" podCreationTimestamp="2026-04-23 17:46:37 +0000 UTC" firstStartedPulling="2026-04-23 17:46:38.021928207 +0000 UTC m=+334.100500163" lastFinishedPulling="2026-04-23 17:46:40.01326078 +0000 UTC m=+336.091832751" observedRunningTime="2026-04-23 17:46:40.553950801 +0000 UTC m=+336.632522783" watchObservedRunningTime="2026-04-23 17:46:40.554654959 +0000 UTC m=+336.633226952"
Apr 23 17:46:45.132786 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.132733 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-s9n68"]
Apr 23 17:46:45.136461 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.136437 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.138844 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.138822 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 17:46:45.139283 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.139254 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 17:46:45.139283 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.139270 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 17:46:45.139451 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.139345 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 17:46:45.139642 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.139625 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fbl2d\""
Apr 23 17:46:45.143698 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.143682 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 17:46:45.143810 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.143795 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 17:46:45.150143 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.150121 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37834963-67dc-4f7f-b7c1-31adea238b05-sys\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.150234 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.150191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-textfile\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.150350 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.150335 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-wtmp\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.150408 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.150391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.150444 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.150427 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-tls\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.150483 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.150448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cxq8\" (UniqueName: \"kubernetes.io/projected/37834963-67dc-4f7f-b7c1-31adea238b05-kube-api-access-7cxq8\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.150520 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.150496 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/37834963-67dc-4f7f-b7c1-31adea238b05-root\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.150561 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.150521 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37834963-67dc-4f7f-b7c1-31adea238b05-metrics-client-ca\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.150561 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.150536 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-accelerators-collector-config\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251483 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37834963-67dc-4f7f-b7c1-31adea238b05-sys\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251483 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-textfile\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251699 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-wtmp\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251699 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-wtmp\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251699 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251632 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37834963-67dc-4f7f-b7c1-31adea238b05-sys\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251699 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251870 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-tls\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251870 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxq8\" (UniqueName: \"kubernetes.io/projected/37834963-67dc-4f7f-b7c1-31adea238b05-kube-api-access-7cxq8\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251870 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/37834963-67dc-4f7f-b7c1-31adea238b05-root\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251870 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-textfile\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251870 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37834963-67dc-4f7f-b7c1-31adea238b05-metrics-client-ca\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.251870 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:46:45.251856 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 23 17:46:45.252101 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251828 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/37834963-67dc-4f7f-b7c1-31adea238b05-root\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.252101 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.251874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-accelerators-collector-config\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.252101 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:46:45.251925 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-tls podName:37834963-67dc-4f7f-b7c1-31adea238b05 nodeName:}" failed. No retries permitted until 2026-04-23 17:46:45.751906413 +0000 UTC m=+341.830478370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-tls") pod "node-exporter-s9n68" (UID: "37834963-67dc-4f7f-b7c1-31adea238b05") : secret "node-exporter-tls" not found
Apr 23 17:46:45.252356 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.252330 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37834963-67dc-4f7f-b7c1-31adea238b05-metrics-client-ca\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.252470 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.252406 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-accelerators-collector-config\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.254167 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.254152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.311658 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.311625 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxq8\" (UniqueName: \"kubernetes.io/projected/37834963-67dc-4f7f-b7c1-31adea238b05-kube-api-access-7cxq8\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68"
Apr 23 17:46:45.563887 ip-10-0-138-17
kubenswrapper[2571]: I0423 17:46:45.563854 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:46:45.755493 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.755451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-tls\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68" Apr 23 17:46:45.757903 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:45.757875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/37834963-67dc-4f7f-b7c1-31adea238b05-node-exporter-tls\") pod \"node-exporter-s9n68\" (UID: \"37834963-67dc-4f7f-b7c1-31adea238b05\") " pod="openshift-monitoring/node-exporter-s9n68" Apr 23 17:46:46.045953 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:46.045910 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-s9n68" Apr 23 17:46:46.054144 ip-10-0-138-17 kubenswrapper[2571]: W0423 17:46:46.054115 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37834963_67dc_4f7f_b7c1_31adea238b05.slice/crio-994b82b7f86a2f3cc6a821f91ace18166e5d688f808d496444512397b9b381de WatchSource:0}: Error finding container 994b82b7f86a2f3cc6a821f91ace18166e5d688f808d496444512397b9b381de: Status 404 returned error can't find the container with id 994b82b7f86a2f3cc6a821f91ace18166e5d688f808d496444512397b9b381de Apr 23 17:46:46.548218 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:46.548180 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s9n68" event={"ID":"37834963-67dc-4f7f-b7c1-31adea238b05","Type":"ContainerStarted","Data":"994b82b7f86a2f3cc6a821f91ace18166e5d688f808d496444512397b9b381de"} Apr 23 17:46:47.552118 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:47.552079 2571 generic.go:358] "Generic (PLEG): container finished" podID="37834963-67dc-4f7f-b7c1-31adea238b05" containerID="a1dcc02e6a0e320db8014a1e9a6e79361f8ea37b4dd515b4252bf0273296d630" exitCode=0 Apr 23 17:46:47.552524 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:47.552155 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s9n68" event={"ID":"37834963-67dc-4f7f-b7c1-31adea238b05","Type":"ContainerDied","Data":"a1dcc02e6a0e320db8014a1e9a6e79361f8ea37b4dd515b4252bf0273296d630"} Apr 23 17:46:48.556938 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:48.556901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s9n68" event={"ID":"37834963-67dc-4f7f-b7c1-31adea238b05","Type":"ContainerStarted","Data":"9fdaf2b6e234e43477d63dcd80568dbec898508aa18acac7fb9f2dbdd490bfc3"} Apr 23 17:46:48.556938 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:48.556937 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s9n68" event={"ID":"37834963-67dc-4f7f-b7c1-31adea238b05","Type":"ContainerStarted","Data":"ef8a7b287e5add4198f8bb09b17d211f1c51788c3d3c62ea1367d4c558ff6717"} Apr 23 17:46:48.590252 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:48.590081 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-s9n68" podStartSLOduration=2.980148573 podStartE2EDuration="3.590064771s" podCreationTimestamp="2026-04-23 17:46:45 +0000 UTC" firstStartedPulling="2026-04-23 17:46:46.055662263 +0000 UTC m=+342.134234219" lastFinishedPulling="2026-04-23 17:46:46.665578461 +0000 UTC m=+342.744150417" observedRunningTime="2026-04-23 17:46:48.589119952 +0000 UTC m=+344.667691930" watchObservedRunningTime="2026-04-23 17:46:48.590064771 +0000 UTC m=+344.668636749" Apr 23 17:46:50.578601 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:50.578557 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" containerID="cri-o://2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265" gracePeriod=30 Apr 23 17:46:51.567457 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:51.567419 2571 generic.go:358] "Generic (PLEG): container finished" podID="5052b556-e070-4527-add1-23d7af9bfa63" containerID="2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265" exitCode=0 Apr 23 17:46:51.567650 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:51.567481 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerDied","Data":"2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265"} Apr 23 17:46:51.567650 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:51.567507 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerStarted","Data":"1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e"} Apr 23 17:46:51.567650 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:51.567522 2571 scope.go:117] "RemoveContainer" containerID="1e37dfc4721d7d9cb08119e0b7e13f69d0ab799fe9887b4e5d3a70dd09fd56a9" Apr 23 17:46:51.567650 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:46:51.567584 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:47:00.145524 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:00.145475 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d77897766-fz4ms"] Apr 23 17:47:10.150449 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:10.150421 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:47:25.165520 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.165456 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" containerID="cri-o://1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e" gracePeriod=30 Apr 23 17:47:25.399184 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.399161 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:47:25.547599 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.547563 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-registry-certificates\") pod \"5052b556-e070-4527-add1-23d7af9bfa63\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " Apr 23 17:47:25.547800 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.547616 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-image-registry-private-configuration\") pod \"5052b556-e070-4527-add1-23d7af9bfa63\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " Apr 23 17:47:25.547800 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.547646 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") pod \"5052b556-e070-4527-add1-23d7af9bfa63\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " Apr 23 17:47:25.547800 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.547670 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-bound-sa-token\") pod \"5052b556-e070-4527-add1-23d7af9bfa63\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " Apr 23 17:47:25.547800 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.547695 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5052b556-e070-4527-add1-23d7af9bfa63-ca-trust-extracted\") pod \"5052b556-e070-4527-add1-23d7af9bfa63\" (UID: 
\"5052b556-e070-4527-add1-23d7af9bfa63\") " Apr 23 17:47:25.547800 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.547722 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-trusted-ca\") pod \"5052b556-e070-4527-add1-23d7af9bfa63\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " Apr 23 17:47:25.548053 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.547801 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-installation-pull-secrets\") pod \"5052b556-e070-4527-add1-23d7af9bfa63\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " Apr 23 17:47:25.548053 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.547833 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7kk8\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-kube-api-access-q7kk8\") pod \"5052b556-e070-4527-add1-23d7af9bfa63\" (UID: \"5052b556-e070-4527-add1-23d7af9bfa63\") " Apr 23 17:47:25.548256 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.548196 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5052b556-e070-4527-add1-23d7af9bfa63" (UID: "5052b556-e070-4527-add1-23d7af9bfa63"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:25.548642 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.548585 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5052b556-e070-4527-add1-23d7af9bfa63" (UID: "5052b556-e070-4527-add1-23d7af9bfa63"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:25.550284 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.550250 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5052b556-e070-4527-add1-23d7af9bfa63" (UID: "5052b556-e070-4527-add1-23d7af9bfa63"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:25.550284 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.550258 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5052b556-e070-4527-add1-23d7af9bfa63" (UID: "5052b556-e070-4527-add1-23d7af9bfa63"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:25.550608 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.550585 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5052b556-e070-4527-add1-23d7af9bfa63" (UID: "5052b556-e070-4527-add1-23d7af9bfa63"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:25.550679 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.550612 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5052b556-e070-4527-add1-23d7af9bfa63" (UID: "5052b556-e070-4527-add1-23d7af9bfa63"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:25.550715 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.550684 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-kube-api-access-q7kk8" (OuterVolumeSpecName: "kube-api-access-q7kk8") pod "5052b556-e070-4527-add1-23d7af9bfa63" (UID: "5052b556-e070-4527-add1-23d7af9bfa63"). InnerVolumeSpecName "kube-api-access-q7kk8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:25.557245 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.557220 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5052b556-e070-4527-add1-23d7af9bfa63-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5052b556-e070-4527-add1-23d7af9bfa63" (UID: "5052b556-e070-4527-add1-23d7af9bfa63"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:47:25.649272 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.649232 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-image-registry-private-configuration\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:47:25.649272 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.649266 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-registry-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:47:25.649272 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.649276 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-bound-sa-token\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:47:25.649488 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.649285 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5052b556-e070-4527-add1-23d7af9bfa63-ca-trust-extracted\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:47:25.649488 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.649295 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-trusted-ca\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:47:25.649488 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.649303 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5052b556-e070-4527-add1-23d7af9bfa63-installation-pull-secrets\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:47:25.649488 
ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.649323 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q7kk8\" (UniqueName: \"kubernetes.io/projected/5052b556-e070-4527-add1-23d7af9bfa63-kube-api-access-q7kk8\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:47:25.649488 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.649332 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5052b556-e070-4527-add1-23d7af9bfa63-registry-certificates\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 17:47:25.657664 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.657634 2571 generic.go:358] "Generic (PLEG): container finished" podID="5052b556-e070-4527-add1-23d7af9bfa63" containerID="1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e" exitCode=0 Apr 23 17:47:25.657820 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.657697 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" Apr 23 17:47:25.657820 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.657728 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerDied","Data":"1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e"} Apr 23 17:47:25.657820 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.657784 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d77897766-fz4ms" event={"ID":"5052b556-e070-4527-add1-23d7af9bfa63","Type":"ContainerDied","Data":"ac935d6a0dc7a7e5c3d213c18c1a3b0fc52027b66469f3f5563232250cc8482c"} Apr 23 17:47:25.657820 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.657815 2571 scope.go:117] "RemoveContainer" containerID="1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e" Apr 23 17:47:25.665984 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.665968 2571 scope.go:117] "RemoveContainer" containerID="2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265" Apr 23 17:47:25.673021 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.673004 2571 scope.go:117] "RemoveContainer" containerID="1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e" Apr 23 17:47:25.673277 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:47:25.673257 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e\": container with ID starting with 1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e not found: ID does not exist" containerID="1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e" Apr 23 17:47:25.673326 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.673285 2571 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e"} err="failed to get container status \"1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e\": rpc error: code = NotFound desc = could not find container \"1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e\": container with ID starting with 1c81c8f78bed2dc34e7289ca7c6d25a7b642533ba96d58fca93599132382c90e not found: ID does not exist" Apr 23 17:47:25.673326 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.673304 2571 scope.go:117] "RemoveContainer" containerID="2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265" Apr 23 17:47:25.673524 ip-10-0-138-17 kubenswrapper[2571]: E0423 17:47:25.673506 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265\": container with ID starting with 2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265 not found: ID does not exist" containerID="2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265" Apr 23 17:47:25.673583 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.673533 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265"} err="failed to get container status \"2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265\": rpc error: code = NotFound desc = could not find container \"2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265\": container with ID starting with 2b0e5d8e4e64469fae8ebaa6c6d2fdccdf3605052284320f634e9ad78b052265 not found: ID does not exist" Apr 23 17:47:25.677204 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:25.677184 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d77897766-fz4ms"] Apr 23 17:47:25.681028 ip-10-0-138-17 
kubenswrapper[2571]: I0423 17:47:25.681008 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d77897766-fz4ms"] Apr 23 17:47:26.492376 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:47:26.492345 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5052b556-e070-4527-add1-23d7af9bfa63" path="/var/lib/kubelet/pods/5052b556-e070-4527-add1-23d7af9bfa63/volumes" Apr 23 17:51:04.382485 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:51:04.382457 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 17:51:04.384070 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:51:04.384040 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 17:56:04.400810 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:56:04.400759 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 17:56:04.402320 ip-10-0-138-17 kubenswrapper[2571]: I0423 17:56:04.402298 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 18:01:04.419366 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:01:04.419338 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 18:01:04.420853 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:01:04.420829 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 18:04:22.116307 ip-10-0-138-17 kubenswrapper[2571]: I0423 
18:04:22.116269 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x"] Apr 23 18:04:22.116865 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.116531 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.116865 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.116547 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.116865 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.116558 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.116865 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.116566 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.116865 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.116640 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.116865 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.116677 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.119373 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.119355 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:22.121710 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.121690 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 18:04:22.122141 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.122121 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 18:04:22.122258 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.122158 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-6rcgh\"" Apr 23 18:04:22.124435 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.124411 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 18:04:22.132174 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.132152 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x"] Apr 23 18:04:22.143300 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.143275 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-m6rqt"] Apr 23 18:04:22.143542 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.143526 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.143625 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.143544 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.143625 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.143554 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.143625 ip-10-0-138-17 kubenswrapper[2571]: I0423 
18:04:22.143562 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.143625 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.143619 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.143625 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.143629 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5052b556-e070-4527-add1-23d7af9bfa63" containerName="registry" Apr 23 18:04:22.146190 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.146173 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 18:04:22.148456 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.148439 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 18:04:22.148552 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.148465 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-zkdjc\"" Apr 23 18:04:22.154139 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.154114 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-m6rqt"] Apr 23 18:04:22.167373 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.167346 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctwmp\" (UniqueName: \"kubernetes.io/projected/0573ddb1-4af0-4efa-8816-2ba7405c2617-kube-api-access-ctwmp\") pod \"llmisvc-controller-manager-68cc5db7c4-xzp9x\" (UID: \"0573ddb1-4af0-4efa-8816-2ba7405c2617\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:22.167512 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.167380 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-86shh\" (UniqueName: \"kubernetes.io/projected/44015263-b752-4776-bca1-9a63bf498d52-kube-api-access-86shh\") pod \"seaweedfs-86cc847c5c-m6rqt\" (UID: \"44015263-b752-4776-bca1-9a63bf498d52\") " pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 18:04:22.167512 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.167452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0573ddb1-4af0-4efa-8816-2ba7405c2617-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xzp9x\" (UID: \"0573ddb1-4af0-4efa-8816-2ba7405c2617\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:22.167512 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.167485 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/44015263-b752-4776-bca1-9a63bf498d52-data\") pod \"seaweedfs-86cc847c5c-m6rqt\" (UID: \"44015263-b752-4776-bca1-9a63bf498d52\") " pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 18:04:22.267822 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.267761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0573ddb1-4af0-4efa-8816-2ba7405c2617-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xzp9x\" (UID: \"0573ddb1-4af0-4efa-8816-2ba7405c2617\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:22.267822 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.267820 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/44015263-b752-4776-bca1-9a63bf498d52-data\") pod \"seaweedfs-86cc847c5c-m6rqt\" (UID: \"44015263-b752-4776-bca1-9a63bf498d52\") " pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 18:04:22.268094 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:04:22.267863 2571 secret.go:189] Couldn't get secret 
kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 23 18:04:22.268094 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.267870 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctwmp\" (UniqueName: \"kubernetes.io/projected/0573ddb1-4af0-4efa-8816-2ba7405c2617-kube-api-access-ctwmp\") pod \"llmisvc-controller-manager-68cc5db7c4-xzp9x\" (UID: \"0573ddb1-4af0-4efa-8816-2ba7405c2617\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:22.268094 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:04:22.267923 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0573ddb1-4af0-4efa-8816-2ba7405c2617-cert podName:0573ddb1-4af0-4efa-8816-2ba7405c2617 nodeName:}" failed. No retries permitted until 2026-04-23 18:04:22.767904149 +0000 UTC m=+1398.846476110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0573ddb1-4af0-4efa-8816-2ba7405c2617-cert") pod "llmisvc-controller-manager-68cc5db7c4-xzp9x" (UID: "0573ddb1-4af0-4efa-8816-2ba7405c2617") : secret "llmisvc-webhook-server-cert" not found Apr 23 18:04:22.268094 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.267986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86shh\" (UniqueName: \"kubernetes.io/projected/44015263-b752-4776-bca1-9a63bf498d52-kube-api-access-86shh\") pod \"seaweedfs-86cc847c5c-m6rqt\" (UID: \"44015263-b752-4776-bca1-9a63bf498d52\") " pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 18:04:22.268318 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.268263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/44015263-b752-4776-bca1-9a63bf498d52-data\") pod \"seaweedfs-86cc847c5c-m6rqt\" (UID: \"44015263-b752-4776-bca1-9a63bf498d52\") " pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 
18:04:22.277559 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.277528 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctwmp\" (UniqueName: \"kubernetes.io/projected/0573ddb1-4af0-4efa-8816-2ba7405c2617-kube-api-access-ctwmp\") pod \"llmisvc-controller-manager-68cc5db7c4-xzp9x\" (UID: \"0573ddb1-4af0-4efa-8816-2ba7405c2617\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:22.281256 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.279065 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86shh\" (UniqueName: \"kubernetes.io/projected/44015263-b752-4776-bca1-9a63bf498d52-kube-api-access-86shh\") pod \"seaweedfs-86cc847c5c-m6rqt\" (UID: \"44015263-b752-4776-bca1-9a63bf498d52\") " pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 18:04:22.456304 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.456204 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 18:04:22.580211 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.580181 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-m6rqt"] Apr 23 18:04:22.583714 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:04:22.583688 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44015263_b752_4776_bca1_9a63bf498d52.slice/crio-3819ed475921d95bac7c1040d6fe0f96d0dfa49636d9ab7c6954ac41e86062de WatchSource:0}: Error finding container 3819ed475921d95bac7c1040d6fe0f96d0dfa49636d9ab7c6954ac41e86062de: Status 404 returned error can't find the container with id 3819ed475921d95bac7c1040d6fe0f96d0dfa49636d9ab7c6954ac41e86062de Apr 23 18:04:22.585061 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.585040 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:04:22.773410 ip-10-0-138-17 
kubenswrapper[2571]: I0423 18:04:22.773372 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0573ddb1-4af0-4efa-8816-2ba7405c2617-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xzp9x\" (UID: \"0573ddb1-4af0-4efa-8816-2ba7405c2617\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:22.775873 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:22.775847 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0573ddb1-4af0-4efa-8816-2ba7405c2617-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xzp9x\" (UID: \"0573ddb1-4af0-4efa-8816-2ba7405c2617\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:23.029528 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:23.029447 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:23.216748 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:23.216702 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x"] Apr 23 18:04:23.219492 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:04:23.219462 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0573ddb1_4af0_4efa_8816_2ba7405c2617.slice/crio-ca278a567045d454abbe9e7f3f72ab617a6a3a801c57f479a67c427b555225c9 WatchSource:0}: Error finding container ca278a567045d454abbe9e7f3f72ab617a6a3a801c57f479a67c427b555225c9: Status 404 returned error can't find the container with id ca278a567045d454abbe9e7f3f72ab617a6a3a801c57f479a67c427b555225c9 Apr 23 18:04:23.228288 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:23.228255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" 
event={"ID":"0573ddb1-4af0-4efa-8816-2ba7405c2617","Type":"ContainerStarted","Data":"ca278a567045d454abbe9e7f3f72ab617a6a3a801c57f479a67c427b555225c9"} Apr 23 18:04:23.229333 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:23.229309 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-m6rqt" event={"ID":"44015263-b752-4776-bca1-9a63bf498d52","Type":"ContainerStarted","Data":"3819ed475921d95bac7c1040d6fe0f96d0dfa49636d9ab7c6954ac41e86062de"} Apr 23 18:04:27.242658 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:27.242618 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" event={"ID":"0573ddb1-4af0-4efa-8816-2ba7405c2617","Type":"ContainerStarted","Data":"5e43f193f3c9766905a087b04e71360ac63e164140679fbb7f5dafeac144e647"} Apr 23 18:04:27.243139 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:27.242733 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:04:27.244054 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:27.244030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-m6rqt" event={"ID":"44015263-b752-4776-bca1-9a63bf498d52","Type":"ContainerStarted","Data":"f9600fe6783b1dd375afd670a1f1fcca7cadda12e180c585c676094ab900811c"} Apr 23 18:04:27.244159 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:27.244119 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 18:04:27.262166 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:27.261061 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" podStartSLOduration=2.212729464 podStartE2EDuration="5.261043419s" podCreationTimestamp="2026-04-23 18:04:22 +0000 UTC" firstStartedPulling="2026-04-23 18:04:23.221287149 +0000 UTC m=+1399.299859105" 
lastFinishedPulling="2026-04-23 18:04:26.269601097 +0000 UTC m=+1402.348173060" observedRunningTime="2026-04-23 18:04:27.259919393 +0000 UTC m=+1403.338491372" watchObservedRunningTime="2026-04-23 18:04:27.261043419 +0000 UTC m=+1403.339615399" Apr 23 18:04:27.275594 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:27.275536 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-m6rqt" podStartSLOduration=1.640685501 podStartE2EDuration="5.275517333s" podCreationTimestamp="2026-04-23 18:04:22 +0000 UTC" firstStartedPulling="2026-04-23 18:04:22.585218856 +0000 UTC m=+1398.663790813" lastFinishedPulling="2026-04-23 18:04:26.220050689 +0000 UTC m=+1402.298622645" observedRunningTime="2026-04-23 18:04:27.274929028 +0000 UTC m=+1403.353501006" watchObservedRunningTime="2026-04-23 18:04:27.275517333 +0000 UTC m=+1403.354089311" Apr 23 18:04:33.249642 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:33.249610 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-m6rqt" Apr 23 18:04:58.249014 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:04:58.248939 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xzp9x" Apr 23 18:06:04.436878 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:04.436851 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 18:06:04.437693 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:04.437675 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 18:06:09.622592 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.622554 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"] Apr 23 18:06:09.626229 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.626205 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.626859 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.626837 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhdpt\" (UniqueName: \"kubernetes.io/projected/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kube-api-access-zhdpt\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.626978 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.626872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.627020 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.626973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-proxy-tls\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.627020 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.627008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.628476 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.628450 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:06:09.628655 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.628633 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:06:09.628736 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.628679 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rvdb2\"" Apr 23 18:06:09.628736 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.628697 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\"" Apr 23 18:06:09.628937 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.628675 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-1794c-predictor-serving-cert\"" Apr 23 18:06:09.636548 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.636528 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"] Apr 23 18:06:09.728086 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.728048 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhdpt\" (UniqueName: 
\"kubernetes.io/projected/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kube-api-access-zhdpt\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.728086 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.728085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.728349 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.728260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-proxy-tls\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.728349 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.728295 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.728457 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:06:09.728398 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-serving-cert: 
secret "isvc-raw-sklearn-batcher-1794c-predictor-serving-cert" not found Apr 23 18:06:09.728534 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.728453 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.728534 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:06:09.728490 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-proxy-tls podName:5cfce20c-4e70-45d2-b4d9-c72c31bfb85e nodeName:}" failed. No retries permitted until 2026-04-23 18:06:10.228472893 +0000 UTC m=+1506.307044860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-proxy-tls") pod "isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" (UID: "5cfce20c-4e70-45d2-b4d9-c72c31bfb85e") : secret "isvc-raw-sklearn-batcher-1794c-predictor-serving-cert" not found Apr 23 18:06:09.728903 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.728883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:09.738373 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:09.738352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zhdpt\" (UniqueName: \"kubernetes.io/projected/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kube-api-access-zhdpt\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:10.231574 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:10.231520 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-proxy-tls\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:10.234112 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:10.234091 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-proxy-tls\") pod \"isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:10.237073 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:10.237054 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:10.359957 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:10.359926 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"] Apr 23 18:06:10.363128 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:06:10.363088 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cfce20c_4e70_45d2_b4d9_c72c31bfb85e.slice/crio-55104d4dc1afae109ff985686c6dc7609f023b44ee8e384d21c2db3d43b9ec92 WatchSource:0}: Error finding container 55104d4dc1afae109ff985686c6dc7609f023b44ee8e384d21c2db3d43b9ec92: Status 404 returned error can't find the container with id 55104d4dc1afae109ff985686c6dc7609f023b44ee8e384d21c2db3d43b9ec92 Apr 23 18:06:10.511026 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:10.510986 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerStarted","Data":"55104d4dc1afae109ff985686c6dc7609f023b44ee8e384d21c2db3d43b9ec92"} Apr 23 18:06:15.528180 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:15.528137 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerStarted","Data":"8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0"} Apr 23 18:06:18.537523 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:18.537438 2571 generic.go:358] "Generic (PLEG): container finished" podID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerID="8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0" exitCode=0 Apr 23 18:06:18.537900 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:18.537515 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerDied","Data":"8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0"} Apr 23 18:06:32.587645 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:32.587608 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerStarted","Data":"d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab"} Apr 23 18:06:34.594538 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:34.594502 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerStarted","Data":"03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6"} Apr 23 18:06:37.605789 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:37.605731 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerStarted","Data":"64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d"} Apr 23 18:06:37.606222 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:37.606018 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:37.606222 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:37.606152 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:37.607276 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:37.607247 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused" Apr 23 18:06:37.625823 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:37.625786 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podStartSLOduration=1.941273701 podStartE2EDuration="28.62575469s" podCreationTimestamp="2026-04-23 18:06:09 +0000 UTC" firstStartedPulling="2026-04-23 18:06:10.365368506 +0000 UTC m=+1506.443940462" lastFinishedPulling="2026-04-23 18:06:37.04984948 +0000 UTC m=+1533.128421451" observedRunningTime="2026-04-23 18:06:37.623712299 +0000 UTC m=+1533.702284278" watchObservedRunningTime="2026-04-23 18:06:37.62575469 +0000 UTC m=+1533.704326667" Apr 23 18:06:38.609066 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:38.609029 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:38.609580 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:38.609125 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused" Apr 23 18:06:38.610133 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:38.610109 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:06:38.612845 ip-10-0-138-17 kubenswrapper[2571]: 
I0423 18:06:38.612828 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:06:39.612028 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:39.611993 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused" Apr 23 18:06:39.612471 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:39.612447 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:06:40.615145 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:40.615103 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused" Apr 23 18:06:40.615615 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:40.615490 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:06:50.615399 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:50.615346 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.15:8080: connect: connection refused" Apr 23 18:06:50.615944 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:06:50.615821 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:00.615696 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:00.615648 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused" Apr 23 18:07:00.616278 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:00.616096 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:10.616137 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:10.616087 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused" Apr 23 18:07:10.616550 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:10.616527 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:20.615157 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:20.615109 2571 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused" Apr 23 18:07:20.615611 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:20.615555 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:30.616153 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:30.616106 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused" Apr 23 18:07:30.616600 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:30.616572 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:40.615925 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:40.615888 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:07:40.616388 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:40.615958 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:07:54.679006 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.678915 2571 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"] Apr 23 18:07:54.679382 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.679333 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" containerID="cri-o://d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab" gracePeriod=30 Apr 23 18:07:54.679449 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.679373 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" containerID="cri-o://64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d" gracePeriod=30 Apr 23 18:07:54.679449 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.679393 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" containerID="cri-o://03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6" gracePeriod=30 Apr 23 18:07:54.813039 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.813002 2571 generic.go:358] "Generic (PLEG): container finished" podID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerID="03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6" exitCode=2 Apr 23 18:07:54.813208 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.813085 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerDied","Data":"03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6"} 
Apr 23 18:07:54.871908 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.871876 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"]
Apr 23 18:07:54.874179 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.874162 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:54.876595 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.876572 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-16a43-predictor-serving-cert\""
Apr 23 18:07:54.876680 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.876573 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\""
Apr 23 18:07:54.886916 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.886889 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"]
Apr 23 18:07:54.911042 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.911003 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:54.911221 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.911088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-proxy-tls\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:54.911221 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.911156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:54.911221 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.911191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmslc\" (UniqueName: \"kubernetes.io/projected/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kube-api-access-wmslc\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:54.974476 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.974389 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"]
Apr 23 18:07:54.976616 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.976597 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:54.979284 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.979267 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-16a43-predictor-serving-cert\""
Apr 23 18:07:54.979368 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.979272 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\""
Apr 23 18:07:54.990101 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:54.990075 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"]
Apr 23 18:07:55.011670 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.011628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-proxy-tls\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.011873 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.011677 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2688bde-9608-4191-9c1b-8401fda6fddf-isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.011873 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.011712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2688bde-9608-4191-9c1b-8401fda6fddf-proxy-tls\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.011873 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.011760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.011873 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:07:55.011811 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-serving-cert: secret "isvc-sklearn-graph-raw-16a43-predictor-serving-cert" not found
Apr 23 18:07:55.011873 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.011815 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmslc\" (UniqueName: \"kubernetes.io/projected/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kube-api-access-wmslc\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.012099 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:07:55.011883 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-proxy-tls podName:a6ed8bf1-302a-4461-8dd7-b13129cd49fe nodeName:}" failed. No retries permitted until 2026-04-23 18:07:55.511860036 +0000 UTC m=+1611.590431996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-proxy-tls") pod "isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" (UID: "a6ed8bf1-302a-4461-8dd7-b13129cd49fe") : secret "isvc-sklearn-graph-raw-16a43-predictor-serving-cert" not found
Apr 23 18:07:55.012099 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.011955 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.012099 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.012003 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz84m\" (UniqueName: \"kubernetes.io/projected/a2688bde-9608-4191-9c1b-8401fda6fddf-kube-api-access-fz84m\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.012099 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.012044 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2688bde-9608-4191-9c1b-8401fda6fddf-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.012362 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.012343 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.012563 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.012546 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.020853 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.020824 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmslc\" (UniqueName: \"kubernetes.io/projected/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kube-api-access-wmslc\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.113065 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.113028 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fz84m\" (UniqueName: \"kubernetes.io/projected/a2688bde-9608-4191-9c1b-8401fda6fddf-kube-api-access-fz84m\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.113254 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.113077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2688bde-9608-4191-9c1b-8401fda6fddf-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.113254 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.113134 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2688bde-9608-4191-9c1b-8401fda6fddf-isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.113254 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.113167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2688bde-9608-4191-9c1b-8401fda6fddf-proxy-tls\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.113431 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:07:55.113330 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-serving-cert: secret "isvc-xgboost-graph-raw-16a43-predictor-serving-cert" not found
Apr 23 18:07:55.113431 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:07:55.113404 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2688bde-9608-4191-9c1b-8401fda6fddf-proxy-tls podName:a2688bde-9608-4191-9c1b-8401fda6fddf nodeName:}" failed. No retries permitted until 2026-04-23 18:07:55.613384594 +0000 UTC m=+1611.691956549 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a2688bde-9608-4191-9c1b-8401fda6fddf-proxy-tls") pod "isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" (UID: "a2688bde-9608-4191-9c1b-8401fda6fddf") : secret "isvc-xgboost-graph-raw-16a43-predictor-serving-cert" not found
Apr 23 18:07:55.113578 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.113549 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2688bde-9608-4191-9c1b-8401fda6fddf-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.113898 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.113878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2688bde-9608-4191-9c1b-8401fda6fddf-isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.122091 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.122064 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz84m\" (UniqueName: \"kubernetes.io/projected/a2688bde-9608-4191-9c1b-8401fda6fddf-kube-api-access-fz84m\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.516320 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.516282 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-proxy-tls\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.518897 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.518876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-proxy-tls\") pod \"isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.617248 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.617204 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2688bde-9608-4191-9c1b-8401fda6fddf-proxy-tls\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.619834 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.619801 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2688bde-9608-4191-9c1b-8401fda6fddf-proxy-tls\") pod \"isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.784123 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.784025 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:07:55.886157 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.886122 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:07:55.914589 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:55.914555 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"]
Apr 23 18:07:55.917593 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:07:55.917565 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ed8bf1_302a_4461_8dd7_b13129cd49fe.slice/crio-b100c01c7733677fdc93bea1b84ff8a1a4b3a7504ceaccf1ef7de4b2741322e3 WatchSource:0}: Error finding container b100c01c7733677fdc93bea1b84ff8a1a4b3a7504ceaccf1ef7de4b2741322e3: Status 404 returned error can't find the container with id b100c01c7733677fdc93bea1b84ff8a1a4b3a7504ceaccf1ef7de4b2741322e3
Apr 23 18:07:56.021899 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:56.021861 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"]
Apr 23 18:07:56.024986 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:07:56.024956 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2688bde_9608_4191_9c1b_8401fda6fddf.slice/crio-72107457dc525b739d3085345ff8f6b142657bfd906b9fe64b2478f0bc92f431 WatchSource:0}: Error finding container 72107457dc525b739d3085345ff8f6b142657bfd906b9fe64b2478f0bc92f431: Status 404 returned error can't find the container with id 72107457dc525b739d3085345ff8f6b142657bfd906b9fe64b2478f0bc92f431
Apr 23 18:07:56.820613 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:56.820569 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" event={"ID":"a2688bde-9608-4191-9c1b-8401fda6fddf","Type":"ContainerStarted","Data":"f15ee4bc91bd146b1b1ceb7585fb7b744f829af2e49a365f098e21a324c97890"}
Apr 23 18:07:56.820613 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:56.820616 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" event={"ID":"a2688bde-9608-4191-9c1b-8401fda6fddf","Type":"ContainerStarted","Data":"72107457dc525b739d3085345ff8f6b142657bfd906b9fe64b2478f0bc92f431"}
Apr 23 18:07:56.821818 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:56.821791 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" event={"ID":"a6ed8bf1-302a-4461-8dd7-b13129cd49fe","Type":"ContainerStarted","Data":"754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692"}
Apr 23 18:07:56.821818 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:56.821820 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" event={"ID":"a6ed8bf1-302a-4461-8dd7-b13129cd49fe","Type":"ContainerStarted","Data":"b100c01c7733677fdc93bea1b84ff8a1a4b3a7504ceaccf1ef7de4b2741322e3"}
Apr 23 18:07:58.609639 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:58.609592 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.15:8643/healthz\": dial tcp 10.132.0.15:8643: connect: connection refused"
Apr 23 18:07:59.838735 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:59.838684 2571 generic.go:358] "Generic (PLEG): container finished" podID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerID="d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab" exitCode=0
Apr 23 18:07:59.839208 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:59.838759 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerDied","Data":"d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab"}
Apr 23 18:07:59.840147 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:59.840124 2571 generic.go:358] "Generic (PLEG): container finished" podID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerID="754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692" exitCode=0
Apr 23 18:07:59.840267 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:07:59.840167 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" event={"ID":"a6ed8bf1-302a-4461-8dd7-b13129cd49fe","Type":"ContainerDied","Data":"754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692"}
Apr 23 18:08:00.615878 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.615817 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused"
Apr 23 18:08:00.616203 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.616177 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:08:00.844141 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.844107 2571 generic.go:358] "Generic (PLEG): container finished" podID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerID="f15ee4bc91bd146b1b1ceb7585fb7b744f829af2e49a365f098e21a324c97890" exitCode=0
Apr 23 18:08:00.844552 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.844178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" event={"ID":"a2688bde-9608-4191-9c1b-8401fda6fddf","Type":"ContainerDied","Data":"f15ee4bc91bd146b1b1ceb7585fb7b744f829af2e49a365f098e21a324c97890"}
Apr 23 18:08:00.846146 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.846126 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" event={"ID":"a6ed8bf1-302a-4461-8dd7-b13129cd49fe","Type":"ContainerStarted","Data":"af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8"}
Apr 23 18:08:00.846230 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.846164 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" event={"ID":"a6ed8bf1-302a-4461-8dd7-b13129cd49fe","Type":"ContainerStarted","Data":"5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f"}
Apr 23 18:08:00.846392 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.846371 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:08:00.846449 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.846403 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:08:00.847524 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.847499 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 23 18:08:00.880190 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:00.880137 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podStartSLOduration=6.880121452 podStartE2EDuration="6.880121452s" podCreationTimestamp="2026-04-23 18:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:08:00.879292859 +0000 UTC m=+1616.957864837" watchObservedRunningTime="2026-04-23 18:08:00.880121452 +0000 UTC m=+1616.958693502"
Apr 23 18:08:01.850063 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:01.850022 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 23 18:08:03.609429 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:03.609376 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.15:8643/healthz\": dial tcp 10.132.0.15:8643: connect: connection refused"
Apr 23 18:08:06.854514 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:06.854482 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"
Apr 23 18:08:06.855185 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:06.855150 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 23 18:08:08.609829 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:08.609757 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.15:8643/healthz\": dial tcp 10.132.0.15:8643: connect: connection refused"
Apr 23 18:08:08.610299 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:08.609968 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"
Apr 23 18:08:10.615352 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:10.615293 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused"
Apr 23 18:08:10.615869 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:10.615606 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:08:13.609752 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:13.609703 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.15:8643/healthz\": dial tcp 10.132.0.15:8643: connect: connection refused"
Apr 23 18:08:16.855340 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:16.855296 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 23 18:08:18.609663 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:18.609577 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.15:8643/healthz\": dial tcp 10.132.0.15:8643: connect: connection refused"
Apr 23 18:08:18.905562 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:18.905482 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" event={"ID":"a2688bde-9608-4191-9c1b-8401fda6fddf","Type":"ContainerStarted","Data":"5b9aa98381845c8656363dd0fc9856fbeb4b0f16309245ae21c98ba5b1f82099"}
Apr 23 18:08:18.905562 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:18.905522 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" event={"ID":"a2688bde-9608-4191-9c1b-8401fda6fddf","Type":"ContainerStarted","Data":"c5ac0c10da0b06223f3a9c73c44f6e9b5a29f0f9bcdb90a16e20783ae8d771df"}
Apr 23 18:08:18.905737 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:18.905724 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:08:18.925598 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:18.925408 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podStartSLOduration=7.4611205179999995 podStartE2EDuration="24.925391073s" podCreationTimestamp="2026-04-23 18:07:54 +0000 UTC" firstStartedPulling="2026-04-23 18:08:00.845590773 +0000 UTC m=+1616.924162728" lastFinishedPulling="2026-04-23 18:08:18.309861323 +0000 UTC m=+1634.388433283" observedRunningTime="2026-04-23 18:08:18.923876329 +0000 UTC m=+1635.002448307" watchObservedRunningTime="2026-04-23 18:08:18.925391073 +0000 UTC m=+1635.003963056"
Apr 23 18:08:19.908903 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:19.908868 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"
Apr 23 18:08:19.910086 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:19.910061 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 23 18:08:20.615423 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:20.615377 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.15:8080: connect: connection refused"
Apr 23 18:08:20.615621 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:20.615547 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"
Apr 23 18:08:20.615741 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:20.615716 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:08:20.615856 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:20.615843 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"
Apr 23 18:08:20.912583 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:20.912500 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 23 18:08:23.609241 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:23.609196 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.15:8643/healthz\": dial tcp 10.132.0.15:8643: connect: connection refused"
Apr 23 18:08:25.355020 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.354995 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"
Apr 23 18:08:25.460542 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.460504 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhdpt\" (UniqueName: \"kubernetes.io/projected/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kube-api-access-zhdpt\") pod \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") "
Apr 23 18:08:25.460750 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.460587 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-proxy-tls\") pod \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") "
Apr 23 18:08:25.460750 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.460705 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kserve-provision-location\") pod \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") "
Apr 23 18:08:25.460912 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.460816 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\") pod \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\" (UID: \"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e\") "
Apr 23 18:08:25.461103 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.461070 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kserve-provision-location" (OuterVolumeSpecName:
"kserve-provision-location") pod "5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" (UID: "5cfce20c-4e70-45d2-b4d9-c72c31bfb85e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:08:25.461220 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.461191 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config") pod "5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" (UID: "5cfce20c-4e70-45d2-b4d9-c72c31bfb85e"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:08:25.462939 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.462912 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kube-api-access-zhdpt" (OuterVolumeSpecName: "kube-api-access-zhdpt") pod "5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" (UID: "5cfce20c-4e70-45d2-b4d9-c72c31bfb85e"). InnerVolumeSpecName "kube-api-access-zhdpt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:08:25.462939 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.462912 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" (UID: "5cfce20c-4e70-45d2-b4d9-c72c31bfb85e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:08:25.561893 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.561853 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:08:25.561893 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.561887 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-isvc-raw-sklearn-batcher-1794c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:08:25.561893 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.561899 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zhdpt\" (UniqueName: \"kubernetes.io/projected/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-kube-api-access-zhdpt\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:08:25.562123 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.561910 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:08:25.917110 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.917083 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" Apr 23 18:08:25.917712 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.917685 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: 
connect: connection refused" Apr 23 18:08:25.928307 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.928274 2571 generic.go:358] "Generic (PLEG): container finished" podID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerID="64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d" exitCode=0 Apr 23 18:08:25.928463 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.928321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerDied","Data":"64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d"} Apr 23 18:08:25.928463 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.928359 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" event={"ID":"5cfce20c-4e70-45d2-b4d9-c72c31bfb85e","Type":"ContainerDied","Data":"55104d4dc1afae109ff985686c6dc7609f023b44ee8e384d21c2db3d43b9ec92"} Apr 23 18:08:25.928463 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.928361 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb" Apr 23 18:08:25.928463 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.928374 2571 scope.go:117] "RemoveContainer" containerID="64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d" Apr 23 18:08:25.936898 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.936619 2571 scope.go:117] "RemoveContainer" containerID="03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6" Apr 23 18:08:25.944651 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.944634 2571 scope.go:117] "RemoveContainer" containerID="d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab" Apr 23 18:08:25.951132 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.951087 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"] Apr 23 18:08:25.952231 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.952213 2571 scope.go:117] "RemoveContainer" containerID="8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0" Apr 23 18:08:25.957680 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.957657 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1794c-predictor-66bb5947d5-9xzhb"] Apr 23 18:08:25.960305 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.960282 2571 scope.go:117] "RemoveContainer" containerID="64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d" Apr 23 18:08:25.960584 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:08:25.960564 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d\": container with ID starting with 64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d not found: ID does not exist" 
containerID="64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d" Apr 23 18:08:25.960640 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.960593 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d"} err="failed to get container status \"64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d\": rpc error: code = NotFound desc = could not find container \"64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d\": container with ID starting with 64e0c7bd0442f4da75b089d8111a410bebdb73f544bad130908649bd863ff09d not found: ID does not exist" Apr 23 18:08:25.960640 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.960619 2571 scope.go:117] "RemoveContainer" containerID="03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6" Apr 23 18:08:25.960888 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:08:25.960870 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6\": container with ID starting with 03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6 not found: ID does not exist" containerID="03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6" Apr 23 18:08:25.960948 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.960894 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6"} err="failed to get container status \"03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6\": rpc error: code = NotFound desc = could not find container \"03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6\": container with ID starting with 03f984002f8660679a5fc44d5f11671630efbe56f817fd4c4a51b44f9675adf6 not found: ID does not exist" Apr 23 
18:08:25.960948 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.960912 2571 scope.go:117] "RemoveContainer" containerID="d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab" Apr 23 18:08:25.961178 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:08:25.961157 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab\": container with ID starting with d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab not found: ID does not exist" containerID="d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab" Apr 23 18:08:25.961260 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.961183 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab"} err="failed to get container status \"d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab\": rpc error: code = NotFound desc = could not find container \"d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab\": container with ID starting with d85eb153d6daea8d8b746bb4b725ae04a89f5b4481c5591a9aac7fbde9f54bab not found: ID does not exist" Apr 23 18:08:25.961260 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.961199 2571 scope.go:117] "RemoveContainer" containerID="8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0" Apr 23 18:08:25.961405 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:08:25.961389 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0\": container with ID starting with 8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0 not found: ID does not exist" containerID="8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0" Apr 23 18:08:25.961446 
ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:25.961412 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0"} err="failed to get container status \"8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0\": rpc error: code = NotFound desc = could not find container \"8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0\": container with ID starting with 8771eca0ac0adb3454322e805b66a8585a5508c98883437afe0799e50e2788d0 not found: ID does not exist" Apr 23 18:08:26.492382 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:26.492345 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" path="/var/lib/kubelet/pods/5cfce20c-4e70-45d2-b4d9-c72c31bfb85e/volumes" Apr 23 18:08:26.855925 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:26.855890 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 18:08:35.917983 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:35.917943 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 18:08:36.855793 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:36.855737 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection 
refused" Apr 23 18:08:45.918578 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:45.918536 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 18:08:46.855914 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:46.855873 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 18:08:55.917879 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:55.917837 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 18:08:56.855833 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:08:56.855790 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 18:09:05.918476 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:05.918436 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 18:09:06.855841 ip-10-0-138-17 kubenswrapper[2571]: I0423 
18:09:06.855797 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 18:09:15.918349 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:15.918307 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 18:09:16.856578 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:16.856546 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" Apr 23 18:09:25.918689 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:25.918612 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" Apr 23 18:09:45.082337 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.082294 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"] Apr 23 18:09:45.082847 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.082717 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" containerID="cri-o://5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f" gracePeriod=30 Apr 23 18:09:45.082847 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.082820 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kube-rbac-proxy" containerID="cri-o://af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8" gracePeriod=30 Apr 23 18:09:45.135045 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135010 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng"] Apr 23 18:09:45.135268 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135254 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" Apr 23 18:09:45.135268 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135268 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" Apr 23 18:09:45.135375 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135285 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" Apr 23 18:09:45.135375 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135290 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" Apr 23 18:09:45.135375 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135301 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" Apr 23 18:09:45.135375 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135306 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" Apr 23 18:09:45.135375 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135314 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="storage-initializer" Apr 23 18:09:45.135375 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135322 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="storage-initializer" Apr 23 18:09:45.135375 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135376 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kserve-container" Apr 23 18:09:45.135578 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135384 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="agent" Apr 23 18:09:45.135578 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.135391 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cfce20c-4e70-45d2-b4d9-c72c31bfb85e" containerName="kube-rbac-proxy" Apr 23 18:09:45.138257 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.138242 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.140041 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.140020 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\"" Apr 23 18:09:45.140172 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.140156 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-d4310-predictor-serving-cert\"" Apr 23 18:09:45.148508 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.148488 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng"] Apr 23 18:09:45.253550 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.253518 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d288688-f7a0-4ced-835e-78c96d19e956-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.253712 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.253638 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d288688-f7a0-4ced-835e-78c96d19e956-isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.253784 ip-10-0-138-17 kubenswrapper[2571]: I0423 
18:09:45.253708 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d288688-f7a0-4ced-835e-78c96d19e956-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.253784 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.253743 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s75rb\" (UniqueName: \"kubernetes.io/projected/5d288688-f7a0-4ced-835e-78c96d19e956-kube-api-access-s75rb\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.284231 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.284202 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"] Apr 23 18:09:45.284535 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.284513 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" containerID="cri-o://c5ac0c10da0b06223f3a9c73c44f6e9b5a29f0f9bcdb90a16e20783ae8d771df" gracePeriod=30 Apr 23 18:09:45.284630 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.284604 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kube-rbac-proxy" containerID="cri-o://5b9aa98381845c8656363dd0fc9856fbeb4b0f16309245ae21c98ba5b1f82099" gracePeriod=30 Apr 23 
18:09:45.289005 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.288979 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89"] Apr 23 18:09:45.298151 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.298132 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.300427 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.300310 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\"" Apr 23 18:09:45.300742 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.300701 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-d4310-predictor-serving-cert\"" Apr 23 18:09:45.301897 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.301858 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89"] Apr 23 18:09:45.355029 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.354968 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.355029 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.354998 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwcqk\" (UniqueName: \"kubernetes.io/projected/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kube-api-access-vwcqk\") pod 
\"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.355029 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.355026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d288688-f7a0-4ced-835e-78c96d19e956-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.355207 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.355093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d288688-f7a0-4ced-835e-78c96d19e956-isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.355207 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.355143 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d288688-f7a0-4ced-835e-78c96d19e956-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.355207 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.355171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s75rb\" (UniqueName: 
\"kubernetes.io/projected/5d288688-f7a0-4ced-835e-78c96d19e956-kube-api-access-s75rb\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.355352 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.355210 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9b34dab-0eb1-452c-981d-98e34b78bdcf-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.355352 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.355240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9b34dab-0eb1-452c-981d-98e34b78bdcf-isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.355352 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:09:45.355288 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-serving-cert: secret "isvc-sklearn-graph-raw-hpa-d4310-predictor-serving-cert" not found Apr 23 18:09:45.355498 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:09:45.355366 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d288688-f7a0-4ced-835e-78c96d19e956-proxy-tls podName:5d288688-f7a0-4ced-835e-78c96d19e956 nodeName:}" failed. 
No retries permitted until 2026-04-23 18:09:45.855346805 +0000 UTC m=+1721.933918766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5d288688-f7a0-4ced-835e-78c96d19e956-proxy-tls") pod "isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" (UID: "5d288688-f7a0-4ced-835e-78c96d19e956") : secret "isvc-sklearn-graph-raw-hpa-d4310-predictor-serving-cert" not found Apr 23 18:09:45.355498 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.355388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d288688-f7a0-4ced-835e-78c96d19e956-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.355664 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.355646 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d288688-f7a0-4ced-835e-78c96d19e956-isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.363458 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.363434 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s75rb\" (UniqueName: \"kubernetes.io/projected/5d288688-f7a0-4ced-835e-78c96d19e956-kube-api-access-s75rb\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 
18:09:45.456361 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.456329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9b34dab-0eb1-452c-981d-98e34b78bdcf-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.456524 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.456368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9b34dab-0eb1-452c-981d-98e34b78bdcf-isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.456589 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.456517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.456589 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.456576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwcqk\" (UniqueName: \"kubernetes.io/projected/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kube-api-access-vwcqk\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.456947 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.456910 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.457082 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.457023 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9b34dab-0eb1-452c-981d-98e34b78bdcf-isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.458849 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.458831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9b34dab-0eb1-452c-981d-98e34b78bdcf-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.465056 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.465032 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwcqk\" (UniqueName: \"kubernetes.io/projected/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kube-api-access-vwcqk\") pod \"isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.610006 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.609920 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:45.731837 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.731812 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89"] Apr 23 18:09:45.734428 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:09:45.734383 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9b34dab_0eb1_452c_981d_98e34b78bdcf.slice/crio-8b0a151d07794a6d4960fc8217122882386253e3d1a9348ad5ac7acaa9bdc805 WatchSource:0}: Error finding container 8b0a151d07794a6d4960fc8217122882386253e3d1a9348ad5ac7acaa9bdc805: Status 404 returned error can't find the container with id 8b0a151d07794a6d4960fc8217122882386253e3d1a9348ad5ac7acaa9bdc805 Apr 23 18:09:45.736242 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.736226 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:09:45.859570 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.859539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d288688-f7a0-4ced-835e-78c96d19e956-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.862123 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.862062 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5d288688-f7a0-4ced-835e-78c96d19e956-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:45.913127 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.913088 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.17:8643/healthz\": dial tcp 10.132.0.17:8643: connect: connection refused" Apr 23 18:09:45.917646 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:45.917616 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 18:09:46.050709 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.050668 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:46.146958 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.146921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" event={"ID":"f9b34dab-0eb1-452c-981d-98e34b78bdcf","Type":"ContainerStarted","Data":"2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8"} Apr 23 18:09:46.147362 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.146967 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" event={"ID":"f9b34dab-0eb1-452c-981d-98e34b78bdcf","Type":"ContainerStarted","Data":"8b0a151d07794a6d4960fc8217122882386253e3d1a9348ad5ac7acaa9bdc805"} Apr 23 18:09:46.149194 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.149166 2571 generic.go:358] "Generic (PLEG): container finished" podID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerID="5b9aa98381845c8656363dd0fc9856fbeb4b0f16309245ae21c98ba5b1f82099" exitCode=2 Apr 23 18:09:46.149325 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.149220 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" event={"ID":"a2688bde-9608-4191-9c1b-8401fda6fddf","Type":"ContainerDied","Data":"5b9aa98381845c8656363dd0fc9856fbeb4b0f16309245ae21c98ba5b1f82099"} Apr 23 18:09:46.152618 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.152593 2571 generic.go:358] "Generic (PLEG): container finished" podID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerID="af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8" exitCode=2 Apr 23 18:09:46.152727 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.152631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" 
event={"ID":"a6ed8bf1-302a-4461-8dd7-b13129cd49fe","Type":"ContainerDied","Data":"af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8"} Apr 23 18:09:46.177413 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.177381 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng"] Apr 23 18:09:46.181224 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:09:46.181198 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d288688_f7a0_4ced_835e_78c96d19e956.slice/crio-8ba0e1a3186d5d4c5e10531c116e40b299f96e0b906c23049f0c3c756d6527fd WatchSource:0}: Error finding container 8ba0e1a3186d5d4c5e10531c116e40b299f96e0b906c23049f0c3c756d6527fd: Status 404 returned error can't find the container with id 8ba0e1a3186d5d4c5e10531c116e40b299f96e0b906c23049f0c3c756d6527fd Apr 23 18:09:46.850936 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.850896 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.16:8643/healthz\": dial tcp 10.132.0.16:8643: connect: connection refused" Apr 23 18:09:46.855245 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:46.855207 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 18:09:47.156867 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:47.156764 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" 
event={"ID":"5d288688-f7a0-4ced-835e-78c96d19e956","Type":"ContainerStarted","Data":"ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829"} Apr 23 18:09:47.156867 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:47.156820 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" event={"ID":"5d288688-f7a0-4ced-835e-78c96d19e956","Type":"ContainerStarted","Data":"8ba0e1a3186d5d4c5e10531c116e40b299f96e0b906c23049f0c3c756d6527fd"} Apr 23 18:09:49.165559 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.165525 2571 generic.go:358] "Generic (PLEG): container finished" podID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerID="c5ac0c10da0b06223f3a9c73c44f6e9b5a29f0f9bcdb90a16e20783ae8d771df" exitCode=0 Apr 23 18:09:49.166014 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.165584 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" event={"ID":"a2688bde-9608-4191-9c1b-8401fda6fddf","Type":"ContainerDied","Data":"c5ac0c10da0b06223f3a9c73c44f6e9b5a29f0f9bcdb90a16e20783ae8d771df"} Apr 23 18:09:49.212730 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.212712 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" Apr 23 18:09:49.286971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.286907 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2688bde-9608-4191-9c1b-8401fda6fddf-isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\") pod \"a2688bde-9608-4191-9c1b-8401fda6fddf\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " Apr 23 18:09:49.287086 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.286979 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2688bde-9608-4191-9c1b-8401fda6fddf-kserve-provision-location\") pod \"a2688bde-9608-4191-9c1b-8401fda6fddf\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " Apr 23 18:09:49.287086 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.287002 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz84m\" (UniqueName: \"kubernetes.io/projected/a2688bde-9608-4191-9c1b-8401fda6fddf-kube-api-access-fz84m\") pod \"a2688bde-9608-4191-9c1b-8401fda6fddf\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " Apr 23 18:09:49.287086 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.287023 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2688bde-9608-4191-9c1b-8401fda6fddf-proxy-tls\") pod \"a2688bde-9608-4191-9c1b-8401fda6fddf\" (UID: \"a2688bde-9608-4191-9c1b-8401fda6fddf\") " Apr 23 18:09:49.287271 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.287247 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2688bde-9608-4191-9c1b-8401fda6fddf-isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config") pod "a2688bde-9608-4191-9c1b-8401fda6fddf" (UID: "a2688bde-9608-4191-9c1b-8401fda6fddf"). InnerVolumeSpecName "isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:09:49.287351 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.287327 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2688bde-9608-4191-9c1b-8401fda6fddf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a2688bde-9608-4191-9c1b-8401fda6fddf" (UID: "a2688bde-9608-4191-9c1b-8401fda6fddf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:09:49.289228 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.289208 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2688bde-9608-4191-9c1b-8401fda6fddf-kube-api-access-fz84m" (OuterVolumeSpecName: "kube-api-access-fz84m") pod "a2688bde-9608-4191-9c1b-8401fda6fddf" (UID: "a2688bde-9608-4191-9c1b-8401fda6fddf"). InnerVolumeSpecName "kube-api-access-fz84m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:09:49.289228 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.289209 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2688bde-9608-4191-9c1b-8401fda6fddf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a2688bde-9608-4191-9c1b-8401fda6fddf" (UID: "a2688bde-9608-4191-9c1b-8401fda6fddf"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:09:49.388465 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.388440 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2688bde-9608-4191-9c1b-8401fda6fddf-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:09:49.388465 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.388463 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fz84m\" (UniqueName: \"kubernetes.io/projected/a2688bde-9608-4191-9c1b-8401fda6fddf-kube-api-access-fz84m\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:09:49.388622 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.388474 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2688bde-9608-4191-9c1b-8401fda6fddf-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:09:49.388622 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.388485 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2688bde-9608-4191-9c1b-8401fda6fddf-isvc-xgboost-graph-raw-16a43-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:09:49.616322 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.616296 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" Apr 23 18:09:49.691581 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.691547 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kserve-provision-location\") pod \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " Apr 23 18:09:49.691738 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.691634 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmslc\" (UniqueName: \"kubernetes.io/projected/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kube-api-access-wmslc\") pod \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " Apr 23 18:09:49.691738 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.691702 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-proxy-tls\") pod \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " Apr 23 18:09:49.691850 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.691799 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\") pod \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\" (UID: \"a6ed8bf1-302a-4461-8dd7-b13129cd49fe\") " Apr 23 18:09:49.691850 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.691834 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") 
pod "a6ed8bf1-302a-4461-8dd7-b13129cd49fe" (UID: "a6ed8bf1-302a-4461-8dd7-b13129cd49fe"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:09:49.691978 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.691964 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:09:49.692158 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.692133 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config") pod "a6ed8bf1-302a-4461-8dd7-b13129cd49fe" (UID: "a6ed8bf1-302a-4461-8dd7-b13129cd49fe"). InnerVolumeSpecName "isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:09:49.693878 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.693859 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a6ed8bf1-302a-4461-8dd7-b13129cd49fe" (UID: "a6ed8bf1-302a-4461-8dd7-b13129cd49fe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:09:49.693938 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.693907 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kube-api-access-wmslc" (OuterVolumeSpecName: "kube-api-access-wmslc") pod "a6ed8bf1-302a-4461-8dd7-b13129cd49fe" (UID: "a6ed8bf1-302a-4461-8dd7-b13129cd49fe"). InnerVolumeSpecName "kube-api-access-wmslc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:09:49.793319 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.793281 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wmslc\" (UniqueName: \"kubernetes.io/projected/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-kube-api-access-wmslc\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:09:49.793319 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.793317 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:09:49.793467 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:49.793330 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6ed8bf1-302a-4461-8dd7-b13129cd49fe-isvc-sklearn-graph-raw-16a43-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:09:50.169481 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.169448 2571 generic.go:358] "Generic (PLEG): container finished" podID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerID="2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8" exitCode=0 Apr 23 18:09:50.169886 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.169523 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" event={"ID":"f9b34dab-0eb1-452c-981d-98e34b78bdcf","Type":"ContainerDied","Data":"2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8"} Apr 23 18:09:50.170922 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.170898 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d288688-f7a0-4ced-835e-78c96d19e956" containerID="ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829" exitCode=0 Apr 23 
18:09:50.171031 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.170985 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" event={"ID":"5d288688-f7a0-4ced-835e-78c96d19e956","Type":"ContainerDied","Data":"ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829"} Apr 23 18:09:50.172753 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.172730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" event={"ID":"a2688bde-9608-4191-9c1b-8401fda6fddf","Type":"ContainerDied","Data":"72107457dc525b739d3085345ff8f6b142657bfd906b9fe64b2478f0bc92f431"} Apr 23 18:09:50.172948 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.172832 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff" Apr 23 18:09:50.172948 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.172785 2571 scope.go:117] "RemoveContainer" containerID="5b9aa98381845c8656363dd0fc9856fbeb4b0f16309245ae21c98ba5b1f82099" Apr 23 18:09:50.174755 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.174730 2571 generic.go:358] "Generic (PLEG): container finished" podID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerID="5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f" exitCode=0 Apr 23 18:09:50.174853 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.174791 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" event={"ID":"a6ed8bf1-302a-4461-8dd7-b13129cd49fe","Type":"ContainerDied","Data":"5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f"} Apr 23 18:09:50.174853 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.174815 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" event={"ID":"a6ed8bf1-302a-4461-8dd7-b13129cd49fe","Type":"ContainerDied","Data":"b100c01c7733677fdc93bea1b84ff8a1a4b3a7504ceaccf1ef7de4b2741322e3"} Apr 23 18:09:50.174853 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.174827 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4" Apr 23 18:09:50.183249 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.183228 2571 scope.go:117] "RemoveContainer" containerID="c5ac0c10da0b06223f3a9c73c44f6e9b5a29f0f9bcdb90a16e20783ae8d771df" Apr 23 18:09:50.192857 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.192838 2571 scope.go:117] "RemoveContainer" containerID="f15ee4bc91bd146b1b1ceb7585fb7b744f829af2e49a365f098e21a324c97890" Apr 23 18:09:50.201122 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.201101 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"] Apr 23 18:09:50.205818 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.205762 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-16a43-predictor-64c6885bc7-lr2ff"] Apr 23 18:09:50.209529 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.209509 2571 scope.go:117] "RemoveContainer" containerID="af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8" Apr 23 18:09:50.216423 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.216405 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"] Apr 23 18:09:50.221063 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.221043 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-16a43-predictor-7d9664f6fc-c8qq4"] Apr 23 18:09:50.225844 ip-10-0-138-17 kubenswrapper[2571]: 
I0423 18:09:50.225824 2571 scope.go:117] "RemoveContainer" containerID="5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f" Apr 23 18:09:50.235619 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.235598 2571 scope.go:117] "RemoveContainer" containerID="754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692" Apr 23 18:09:50.243303 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.243234 2571 scope.go:117] "RemoveContainer" containerID="af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8" Apr 23 18:09:50.243503 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:09:50.243484 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8\": container with ID starting with af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8 not found: ID does not exist" containerID="af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8" Apr 23 18:09:50.243574 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.243512 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8"} err="failed to get container status \"af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8\": rpc error: code = NotFound desc = could not find container \"af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8\": container with ID starting with af967cb1a63525093d339eac27189e2f6b768fbf870fa82057f5ee64739a95e8 not found: ID does not exist" Apr 23 18:09:50.243574 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.243530 2571 scope.go:117] "RemoveContainer" containerID="5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f" Apr 23 18:09:50.243818 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:09:50.243796 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f\": container with ID starting with 5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f not found: ID does not exist" containerID="5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f" Apr 23 18:09:50.243908 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.243824 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f"} err="failed to get container status \"5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f\": rpc error: code = NotFound desc = could not find container \"5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f\": container with ID starting with 5767168ea85dfd578a2b57ab67e3147e32415372588008f7399c2dfa50f1635f not found: ID does not exist" Apr 23 18:09:50.243908 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.243841 2571 scope.go:117] "RemoveContainer" containerID="754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692" Apr 23 18:09:50.244106 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:09:50.244080 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692\": container with ID starting with 754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692 not found: ID does not exist" containerID="754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692" Apr 23 18:09:50.244166 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.244113 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692"} err="failed to get container status \"754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692\": rpc error: code = NotFound desc = could not 
find container \"754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692\": container with ID starting with 754e4fc4419de330a211a07143326e56cca1bf8b089ebf29ba7701ea68aab692 not found: ID does not exist" Apr 23 18:09:50.492840 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.492730 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" path="/var/lib/kubelet/pods/a2688bde-9608-4191-9c1b-8401fda6fddf/volumes" Apr 23 18:09:50.493483 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:50.493452 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" path="/var/lib/kubelet/pods/a6ed8bf1-302a-4461-8dd7-b13129cd49fe/volumes" Apr 23 18:09:51.181440 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.181399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" event={"ID":"f9b34dab-0eb1-452c-981d-98e34b78bdcf","Type":"ContainerStarted","Data":"f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf"} Apr 23 18:09:51.181440 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.181439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" event={"ID":"f9b34dab-0eb1-452c-981d-98e34b78bdcf","Type":"ContainerStarted","Data":"56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b"} Apr 23 18:09:51.182001 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.181696 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:51.183180 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.183158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" 
event={"ID":"5d288688-f7a0-4ced-835e-78c96d19e956","Type":"ContainerStarted","Data":"c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8"} Apr 23 18:09:51.183180 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.183183 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" event={"ID":"5d288688-f7a0-4ced-835e-78c96d19e956","Type":"ContainerStarted","Data":"537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b"} Apr 23 18:09:51.183516 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.183498 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:51.183574 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.183523 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:51.184762 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.184737 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 18:09:51.201011 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.200962 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podStartSLOduration=6.200945234 podStartE2EDuration="6.200945234s" podCreationTimestamp="2026-04-23 18:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:09:51.200529038 +0000 UTC m=+1727.279101057" watchObservedRunningTime="2026-04-23 18:09:51.200945234 
+0000 UTC m=+1727.279517214" Apr 23 18:09:51.219130 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:51.219084 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podStartSLOduration=6.219068351 podStartE2EDuration="6.219068351s" podCreationTimestamp="2026-04-23 18:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:09:51.218513817 +0000 UTC m=+1727.297085794" watchObservedRunningTime="2026-04-23 18:09:51.219068351 +0000 UTC m=+1727.297640400" Apr 23 18:09:52.187346 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:52.187300 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 18:09:52.187712 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:52.187362 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:52.188290 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:52.188264 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 23 18:09:53.189742 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:53.189705 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 23 18:09:57.191085 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:57.191059 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:09:57.191602 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:57.191580 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 18:09:58.194142 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:58.194105 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:09:58.194810 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:09:58.194784 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 23 18:10:07.191740 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:07.191701 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 18:10:08.194940 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:08.194892 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 23 18:10:17.192334 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:17.192295 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 18:10:18.195729 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:18.195690 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 23 18:10:27.192349 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:27.192311 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 18:10:28.194721 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:28.194677 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 23 18:10:37.191548 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:37.191507 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 18:10:38.195346 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:38.195305 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 23 18:10:47.191584 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:47.191538 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 18:10:48.195544 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:48.195493 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 23 18:10:57.192310 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:57.192229 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:10:58.196026 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:10:58.195987 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:11:04.457603 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:04.457569 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 18:11:04.458412 ip-10-0-138-17 
kubenswrapper[2571]: I0423 18:11:04.458392 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 18:11:25.372556 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.372524 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng"] Apr 23 18:11:25.372999 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.372857 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" containerID="cri-o://537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b" gracePeriod=30 Apr 23 18:11:25.372999 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.372905 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kube-rbac-proxy" containerID="cri-o://c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8" gracePeriod=30 Apr 23 18:11:25.412192 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412160 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62"] Apr 23 18:11:25.412491 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412475 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" Apr 23 18:11:25.412564 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412494 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" Apr 23 18:11:25.412564 ip-10-0-138-17 kubenswrapper[2571]: 
I0423 18:11:25.412527 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kube-rbac-proxy" Apr 23 18:11:25.412564 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412537 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kube-rbac-proxy" Apr 23 18:11:25.412564 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412550 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="storage-initializer" Apr 23 18:11:25.412564 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412559 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="storage-initializer" Apr 23 18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412567 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kube-rbac-proxy" Apr 23 18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412575 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kube-rbac-proxy" Apr 23 18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412586 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" Apr 23 18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412594 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" Apr 23 18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412612 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="storage-initializer" Apr 23 
18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412620 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="storage-initializer" Apr 23 18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412684 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kserve-container" Apr 23 18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412697 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6ed8bf1-302a-4461-8dd7-b13129cd49fe" containerName="kube-rbac-proxy" Apr 23 18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412710 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kserve-container" Apr 23 18:11:25.412905 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.412719 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2688bde-9608-4191-9c1b-8401fda6fddf" containerName="kube-rbac-proxy" Apr 23 18:11:25.415886 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.415866 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.419068 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.419047 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-befab-predictor-serving-cert\"" Apr 23 18:11:25.419068 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.419062 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-befab-kube-rbac-proxy-sar-config\"" Apr 23 18:11:25.425293 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.425267 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62"] Apr 23 18:11:25.477281 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.477247 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89"] Apr 23 18:11:25.477786 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.477627 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" containerID="cri-o://56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b" gracePeriod=30 Apr 23 18:11:25.477786 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.477721 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kube-rbac-proxy" containerID="cri-o://f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf" gracePeriod=30 Apr 23 18:11:25.492658 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.492621 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d64a7c8b-3580-449e-80d1-a32d922e434b-proxy-tls\") pod \"message-dumper-raw-befab-predictor-7b987ddd48-tqq62\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.492733 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.492688 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d64a7c8b-3580-449e-80d1-a32d922e434b-message-dumper-raw-befab-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-befab-predictor-7b987ddd48-tqq62\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.492817 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.492757 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qxqr\" (UniqueName: \"kubernetes.io/projected/d64a7c8b-3580-449e-80d1-a32d922e434b-kube-api-access-7qxqr\") pod \"message-dumper-raw-befab-predictor-7b987ddd48-tqq62\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.593916 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.593873 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qxqr\" (UniqueName: \"kubernetes.io/projected/d64a7c8b-3580-449e-80d1-a32d922e434b-kube-api-access-7qxqr\") pod \"message-dumper-raw-befab-predictor-7b987ddd48-tqq62\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.594070 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.593936 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d64a7c8b-3580-449e-80d1-a32d922e434b-proxy-tls\") pod \"message-dumper-raw-befab-predictor-7b987ddd48-tqq62\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.594070 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.594004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d64a7c8b-3580-449e-80d1-a32d922e434b-message-dumper-raw-befab-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-befab-predictor-7b987ddd48-tqq62\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.594822 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.594714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d64a7c8b-3580-449e-80d1-a32d922e434b-message-dumper-raw-befab-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-befab-predictor-7b987ddd48-tqq62\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.596369 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.596346 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d64a7c8b-3580-449e-80d1-a32d922e434b-proxy-tls\") pod \"message-dumper-raw-befab-predictor-7b987ddd48-tqq62\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.603104 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.603082 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7qxqr\" (UniqueName: \"kubernetes.io/projected/d64a7c8b-3580-449e-80d1-a32d922e434b-kube-api-access-7qxqr\") pod \"message-dumper-raw-befab-predictor-7b987ddd48-tqq62\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.726295 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.726197 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:25.853950 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:25.853913 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62"] Apr 23 18:11:25.857057 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:11:25.857031 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd64a7c8b_3580_449e_80d1_a32d922e434b.slice/crio-1604ec5a8b82ee19de3e51c94825286fe3d5463647beb86f8a08c4c412d78fd0 WatchSource:0}: Error finding container 1604ec5a8b82ee19de3e51c94825286fe3d5463647beb86f8a08c4c412d78fd0: Status 404 returned error can't find the container with id 1604ec5a8b82ee19de3e51c94825286fe3d5463647beb86f8a08c4c412d78fd0 Apr 23 18:11:26.448528 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:26.448498 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d288688-f7a0-4ced-835e-78c96d19e956" containerID="c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8" exitCode=2 Apr 23 18:11:26.448975 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:26.448578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" event={"ID":"5d288688-f7a0-4ced-835e-78c96d19e956","Type":"ContainerDied","Data":"c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8"} Apr 23 18:11:26.449602 
ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:26.449580 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" event={"ID":"d64a7c8b-3580-449e-80d1-a32d922e434b","Type":"ContainerStarted","Data":"1604ec5a8b82ee19de3e51c94825286fe3d5463647beb86f8a08c4c412d78fd0"} Apr 23 18:11:26.451368 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:26.451344 2571 generic.go:358] "Generic (PLEG): container finished" podID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerID="f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf" exitCode=2 Apr 23 18:11:26.451455 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:26.451389 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" event={"ID":"f9b34dab-0eb1-452c-981d-98e34b78bdcf","Type":"ContainerDied","Data":"f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf"} Apr 23 18:11:27.188107 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:27.188004 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.18:8643/healthz\": dial tcp 10.132.0.18:8643: connect: connection refused" Apr 23 18:11:27.192405 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:27.192371 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 18:11:27.456050 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:27.455950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" event={"ID":"d64a7c8b-3580-449e-80d1-a32d922e434b","Type":"ContainerStarted","Data":"0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f"} Apr 23 18:11:27.456050 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:27.455999 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" event={"ID":"d64a7c8b-3580-449e-80d1-a32d922e434b","Type":"ContainerStarted","Data":"6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4"} Apr 23 18:11:27.456467 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:27.456128 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:27.474290 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:27.474228 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" podStartSLOduration=1.456076012 podStartE2EDuration="2.474213204s" podCreationTimestamp="2026-04-23 18:11:25 +0000 UTC" firstStartedPulling="2026-04-23 18:11:25.858938119 +0000 UTC m=+1821.937510087" lastFinishedPulling="2026-04-23 18:11:26.877075319 +0000 UTC m=+1822.955647279" observedRunningTime="2026-04-23 18:11:27.47287241 +0000 UTC m=+1823.551444388" watchObservedRunningTime="2026-04-23 18:11:27.474213204 +0000 UTC m=+1823.552785182" Apr 23 18:11:28.190676 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:28.190636 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.19:8643/healthz\": dial tcp 10.132.0.19:8643: connect: connection refused" Apr 23 18:11:28.194933 ip-10-0-138-17 kubenswrapper[2571]: I0423 
18:11:28.194897 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 23 18:11:28.459046 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:28.458969 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:28.460756 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:28.460734 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:29.433977 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.433954 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:11:29.463885 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.463802 2571 generic.go:358] "Generic (PLEG): container finished" podID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerID="56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b" exitCode=0 Apr 23 18:11:29.464324 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.463937 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" Apr 23 18:11:29.464324 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.463936 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" event={"ID":"f9b34dab-0eb1-452c-981d-98e34b78bdcf","Type":"ContainerDied","Data":"56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b"} Apr 23 18:11:29.464324 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.463983 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89" event={"ID":"f9b34dab-0eb1-452c-981d-98e34b78bdcf","Type":"ContainerDied","Data":"8b0a151d07794a6d4960fc8217122882386253e3d1a9348ad5ac7acaa9bdc805"} Apr 23 18:11:29.464324 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.464004 2571 scope.go:117] "RemoveContainer" containerID="f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf" Apr 23 18:11:29.472118 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.472098 2571 scope.go:117] "RemoveContainer" containerID="56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b" Apr 23 18:11:29.479470 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.479452 2571 scope.go:117] "RemoveContainer" containerID="2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8" Apr 23 18:11:29.486240 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.486219 2571 scope.go:117] "RemoveContainer" containerID="f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf" Apr 23 18:11:29.486504 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:11:29.486476 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf\": container with ID starting with 
f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf not found: ID does not exist" containerID="f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf" Apr 23 18:11:29.486568 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.486508 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf"} err="failed to get container status \"f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf\": rpc error: code = NotFound desc = could not find container \"f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf\": container with ID starting with f28ebaf8087429b23bbf149cc54452bcd00cfd0b5ae137fe7f7ff9047be233bf not found: ID does not exist" Apr 23 18:11:29.486568 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.486528 2571 scope.go:117] "RemoveContainer" containerID="56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b" Apr 23 18:11:29.486807 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:11:29.486785 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b\": container with ID starting with 56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b not found: ID does not exist" containerID="56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b" Apr 23 18:11:29.486874 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.486816 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b"} err="failed to get container status \"56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b\": rpc error: code = NotFound desc = could not find container \"56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b\": container with ID starting with 
56c1357ff063935268816af2f3b474dbcad9b937330eb009c28e79aa2d7f543b not found: ID does not exist" Apr 23 18:11:29.486874 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.486838 2571 scope.go:117] "RemoveContainer" containerID="2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8" Apr 23 18:11:29.487091 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:11:29.487076 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8\": container with ID starting with 2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8 not found: ID does not exist" containerID="2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8" Apr 23 18:11:29.487140 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.487094 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8"} err="failed to get container status \"2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8\": rpc error: code = NotFound desc = could not find container \"2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8\": container with ID starting with 2d209be06a5d7f8d29a152010a63c51bdf5cc2bc9c5b90b85fade1a17888c6d8 not found: ID does not exist" Apr 23 18:11:29.524493 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.524442 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwcqk\" (UniqueName: \"kubernetes.io/projected/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kube-api-access-vwcqk\") pod \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " Apr 23 18:11:29.524634 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.524510 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f9b34dab-0eb1-452c-981d-98e34b78bdcf-proxy-tls\") pod \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " Apr 23 18:11:29.524634 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.524548 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kserve-provision-location\") pod \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " Apr 23 18:11:29.524634 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.524576 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9b34dab-0eb1-452c-981d-98e34b78bdcf-isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") pod \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\" (UID: \"f9b34dab-0eb1-452c-981d-98e34b78bdcf\") " Apr 23 18:11:29.524950 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.524915 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f9b34dab-0eb1-452c-981d-98e34b78bdcf" (UID: "f9b34dab-0eb1-452c-981d-98e34b78bdcf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:11:29.525048 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.524966 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b34dab-0eb1-452c-981d-98e34b78bdcf-isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config") pod "f9b34dab-0eb1-452c-981d-98e34b78bdcf" (UID: "f9b34dab-0eb1-452c-981d-98e34b78bdcf"). 
InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:11:29.526828 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.526806 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b34dab-0eb1-452c-981d-98e34b78bdcf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f9b34dab-0eb1-452c-981d-98e34b78bdcf" (UID: "f9b34dab-0eb1-452c-981d-98e34b78bdcf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:11:29.526920 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.526897 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kube-api-access-vwcqk" (OuterVolumeSpecName: "kube-api-access-vwcqk") pod "f9b34dab-0eb1-452c-981d-98e34b78bdcf" (UID: "f9b34dab-0eb1-452c-981d-98e34b78bdcf"). InnerVolumeSpecName "kube-api-access-vwcqk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:11:29.626103 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.626063 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9b34dab-0eb1-452c-981d-98e34b78bdcf-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:11:29.626103 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.626098 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:11:29.626103 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.626108 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9b34dab-0eb1-452c-981d-98e34b78bdcf-isvc-xgboost-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:11:29.626322 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.626120 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwcqk\" (UniqueName: \"kubernetes.io/projected/f9b34dab-0eb1-452c-981d-98e34b78bdcf-kube-api-access-vwcqk\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:11:29.785994 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.785961 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89"] Apr 23 18:11:29.789555 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:29.789525 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-d4310-predictor-74d9b6c5d9-mtg89"] Apr 23 18:11:30.199879 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.199857 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:11:30.331686 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.331643 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d288688-f7a0-4ced-835e-78c96d19e956-proxy-tls\") pod \"5d288688-f7a0-4ced-835e-78c96d19e956\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " Apr 23 18:11:30.331923 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.331701 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d288688-f7a0-4ced-835e-78c96d19e956-kserve-provision-location\") pod \"5d288688-f7a0-4ced-835e-78c96d19e956\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " Apr 23 18:11:30.331923 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.331737 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s75rb\" (UniqueName: \"kubernetes.io/projected/5d288688-f7a0-4ced-835e-78c96d19e956-kube-api-access-s75rb\") pod \"5d288688-f7a0-4ced-835e-78c96d19e956\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " Apr 23 18:11:30.331923 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.331851 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d288688-f7a0-4ced-835e-78c96d19e956-isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") pod \"5d288688-f7a0-4ced-835e-78c96d19e956\" (UID: \"5d288688-f7a0-4ced-835e-78c96d19e956\") " Apr 23 18:11:30.332158 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.332121 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d288688-f7a0-4ced-835e-78c96d19e956-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "5d288688-f7a0-4ced-835e-78c96d19e956" (UID: "5d288688-f7a0-4ced-835e-78c96d19e956"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:11:30.332268 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.332240 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d288688-f7a0-4ced-835e-78c96d19e956-isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config") pod "5d288688-f7a0-4ced-835e-78c96d19e956" (UID: "5d288688-f7a0-4ced-835e-78c96d19e956"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:11:30.334009 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.333988 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d288688-f7a0-4ced-835e-78c96d19e956-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5d288688-f7a0-4ced-835e-78c96d19e956" (UID: "5d288688-f7a0-4ced-835e-78c96d19e956"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:11:30.334087 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.334060 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d288688-f7a0-4ced-835e-78c96d19e956-kube-api-access-s75rb" (OuterVolumeSpecName: "kube-api-access-s75rb") pod "5d288688-f7a0-4ced-835e-78c96d19e956" (UID: "5d288688-f7a0-4ced-835e-78c96d19e956"). InnerVolumeSpecName "kube-api-access-s75rb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:11:30.432924 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.432833 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d288688-f7a0-4ced-835e-78c96d19e956-isvc-sklearn-graph-raw-hpa-d4310-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:11:30.432924 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.432865 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d288688-f7a0-4ced-835e-78c96d19e956-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:11:30.432924 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.432878 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d288688-f7a0-4ced-835e-78c96d19e956-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:11:30.432924 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.432893 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s75rb\" (UniqueName: \"kubernetes.io/projected/5d288688-f7a0-4ced-835e-78c96d19e956-kube-api-access-s75rb\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:11:30.469020 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.468980 2571 generic.go:358] "Generic (PLEG): container finished" podID="5d288688-f7a0-4ced-835e-78c96d19e956" containerID="537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b" exitCode=0 Apr 23 18:11:30.469460 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.469064 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" 
event={"ID":"5d288688-f7a0-4ced-835e-78c96d19e956","Type":"ContainerDied","Data":"537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b"} Apr 23 18:11:30.469460 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.469076 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" Apr 23 18:11:30.469460 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.469110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng" event={"ID":"5d288688-f7a0-4ced-835e-78c96d19e956","Type":"ContainerDied","Data":"8ba0e1a3186d5d4c5e10531c116e40b299f96e0b906c23049f0c3c756d6527fd"} Apr 23 18:11:30.469460 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.469126 2571 scope.go:117] "RemoveContainer" containerID="c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8" Apr 23 18:11:30.478258 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.478237 2571 scope.go:117] "RemoveContainer" containerID="537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b" Apr 23 18:11:30.485629 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.485608 2571 scope.go:117] "RemoveContainer" containerID="ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829" Apr 23 18:11:30.493474 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.493447 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" path="/var/lib/kubelet/pods/f9b34dab-0eb1-452c-981d-98e34b78bdcf/volumes" Apr 23 18:11:30.493785 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.493756 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng"] Apr 23 18:11:30.493995 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.493978 2571 scope.go:117] "RemoveContainer" 
containerID="c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8" Apr 23 18:11:30.494284 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:11:30.494267 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8\": container with ID starting with c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8 not found: ID does not exist" containerID="c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8" Apr 23 18:11:30.494327 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.494294 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8"} err="failed to get container status \"c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8\": rpc error: code = NotFound desc = could not find container \"c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8\": container with ID starting with c008466c335faff1559f74783a8299f16199d4b1cf8f3ecca7d77b00bbebafe8 not found: ID does not exist" Apr 23 18:11:30.494327 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.494313 2571 scope.go:117] "RemoveContainer" containerID="537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b" Apr 23 18:11:30.494546 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:11:30.494527 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b\": container with ID starting with 537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b not found: ID does not exist" containerID="537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b" Apr 23 18:11:30.494586 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.494554 2571 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b"} err="failed to get container status \"537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b\": rpc error: code = NotFound desc = could not find container \"537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b\": container with ID starting with 537a6d222b114e4e468bfa2938e712f1140814e78ad8101896a6e5421dc9550b not found: ID does not exist" Apr 23 18:11:30.494586 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.494577 2571 scope.go:117] "RemoveContainer" containerID="ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829" Apr 23 18:11:30.494985 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:11:30.494956 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829\": container with ID starting with ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829 not found: ID does not exist" containerID="ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829" Apr 23 18:11:30.495076 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.494992 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829"} err="failed to get container status \"ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829\": rpc error: code = NotFound desc = could not find container \"ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829\": container with ID starting with ebf9c3a110b8e088dfe14934382c797931ed439c5fb0d4ea0615a17b3db29829 not found: ID does not exist" Apr 23 18:11:30.496496 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:30.496476 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-d4310-predictor-575c475777-jbpng"] Apr 23 
18:11:32.492516 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:32.492485 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" path="/var/lib/kubelet/pods/5d288688-f7a0-4ced-835e-78c96d19e956/volumes" Apr 23 18:11:35.474118 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:35.474083 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:11:45.503680 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.503642 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz"] Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.503934 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kube-rbac-proxy" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.503948 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kube-rbac-proxy" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.503968 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="storage-initializer" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.503975 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="storage-initializer" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.503987 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kube-rbac-proxy" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.503994 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kube-rbac-proxy" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504004 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504009 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504015 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504020 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504030 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="storage-initializer" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504039 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="storage-initializer" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504080 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kserve-container" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504091 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kserve-container" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504098 2571 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="5d288688-f7a0-4ced-835e-78c96d19e956" containerName="kube-rbac-proxy" Apr 23 18:11:45.504098 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.504103 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9b34dab-0eb1-452c-981d-98e34b78bdcf" containerName="kube-rbac-proxy" Apr 23 18:11:45.510202 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.510179 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.512083 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.512062 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-befab-predictor-serving-cert\"" Apr 23 18:11:45.512357 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.512338 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-befab-kube-rbac-proxy-sar-config\"" Apr 23 18:11:45.518436 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.518410 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz"] Apr 23 18:11:45.646697 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.646638 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vw86\" (UniqueName: \"kubernetes.io/projected/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kube-api-access-7vw86\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.646697 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.646695 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kserve-provision-location\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.646947 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.646829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfe17efd-10c4-48fe-a81d-9f7ac372748a-proxy-tls\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.646947 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.646862 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cfe17efd-10c4-48fe-a81d-9f7ac372748a-isvc-logger-raw-befab-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.747731 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.747680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfe17efd-10c4-48fe-a81d-9f7ac372748a-proxy-tls\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.747731 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.747735 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/cfe17efd-10c4-48fe-a81d-9f7ac372748a-isvc-logger-raw-befab-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.747968 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.747764 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vw86\" (UniqueName: \"kubernetes.io/projected/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kube-api-access-7vw86\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.747968 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.747806 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kserve-provision-location\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.747968 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:11:45.747886 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-serving-cert: secret "isvc-logger-raw-befab-predictor-serving-cert" not found Apr 23 18:11:45.748082 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:11:45.747971 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfe17efd-10c4-48fe-a81d-9f7ac372748a-proxy-tls podName:cfe17efd-10c4-48fe-a81d-9f7ac372748a nodeName:}" failed. No retries permitted until 2026-04-23 18:11:46.247949663 +0000 UTC m=+1842.326521629 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cfe17efd-10c4-48fe-a81d-9f7ac372748a-proxy-tls") pod "isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" (UID: "cfe17efd-10c4-48fe-a81d-9f7ac372748a") : secret "isvc-logger-raw-befab-predictor-serving-cert" not found Apr 23 18:11:45.748244 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.748228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kserve-provision-location\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.748497 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.748480 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cfe17efd-10c4-48fe-a81d-9f7ac372748a-isvc-logger-raw-befab-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:45.757378 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:45.757312 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vw86\" (UniqueName: \"kubernetes.io/projected/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kube-api-access-7vw86\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:46.252555 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:46.252523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/cfe17efd-10c4-48fe-a81d-9f7ac372748a-proxy-tls\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:46.255179 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:46.255161 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfe17efd-10c4-48fe-a81d-9f7ac372748a-proxy-tls\") pod \"isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:46.421532 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:46.421488 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:46.552723 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:46.552696 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz"] Apr 23 18:11:46.555350 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:11:46.555321 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfe17efd_10c4_48fe_a81d_9f7ac372748a.slice/crio-b833faca6fb570e49edaa56c58c5ad807c502fa30145303ce0aae7388fad6610 WatchSource:0}: Error finding container b833faca6fb570e49edaa56c58c5ad807c502fa30145303ce0aae7388fad6610: Status 404 returned error can't find the container with id b833faca6fb570e49edaa56c58c5ad807c502fa30145303ce0aae7388fad6610 Apr 23 18:11:47.515337 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:47.515299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" 
event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerStarted","Data":"8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398"} Apr 23 18:11:47.515337 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:47.515333 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerStarted","Data":"b833faca6fb570e49edaa56c58c5ad807c502fa30145303ce0aae7388fad6610"} Apr 23 18:11:50.526614 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:50.526568 2571 generic.go:358] "Generic (PLEG): container finished" podID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerID="8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398" exitCode=0 Apr 23 18:11:50.526614 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:50.526613 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerDied","Data":"8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398"} Apr 23 18:11:51.530870 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:51.530829 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerStarted","Data":"fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17"} Apr 23 18:11:51.530870 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:51.530876 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerStarted","Data":"c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39"} Apr 23 18:11:51.531307 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:51.530886 2571 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerStarted","Data":"ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c"} Apr 23 18:11:51.531307 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:51.531180 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:51.531406 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:51.531311 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:51.532304 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:51.532274 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:11:51.552237 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:51.552186 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podStartSLOduration=6.552171113 podStartE2EDuration="6.552171113s" podCreationTimestamp="2026-04-23 18:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:11:51.551341381 +0000 UTC m=+1847.629913370" watchObservedRunningTime="2026-04-23 18:11:51.552171113 +0000 UTC m=+1847.630743090" Apr 23 18:11:52.534396 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:52.534357 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:52.534873 ip-10-0-138-17 kubenswrapper[2571]: I0423 
18:11:52.534482 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:11:52.535416 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:52.535387 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:11:53.537176 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:53.537120 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:11:53.537598 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:53.537575 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:11:58.541181 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:58.541148 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:11:58.541893 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:58.541851 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 
23 18:11:58.542290 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:11:58.542268 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:12:08.542009 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:08.541967 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:12:08.542451 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:08.542427 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:12:18.542155 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:18.542097 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:12:18.542600 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:18.542573 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:12:28.541838 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:28.541718 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:12:28.542271 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:28.542140 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:12:38.541820 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:38.541744 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:12:38.542356 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:38.542253 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:12:48.542132 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:48.542084 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:12:48.542602 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:48.542578 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 23 18:12:58.542677 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:58.542627 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:12:58.543147 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:12:58.543070 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:13:08.542729 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:08.542699 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:13:08.543169 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:08.542890 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:13:20.496950 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.496921 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-befab-predictor-7b987ddd48-tqq62_d64a7c8b-3580-449e-80d1-a32d922e434b/kserve-container/0.log" Apr 23 18:13:20.695335 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.695299 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz"] Apr 23 18:13:20.695745 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.695642 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" 
containerName="kserve-container" containerID="cri-o://ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c" gracePeriod=30 Apr 23 18:13:20.695745 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.695672 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" containerID="cri-o://c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39" gracePeriod=30 Apr 23 18:13:20.695958 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.695663 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" containerID="cri-o://fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17" gracePeriod=30 Apr 23 18:13:20.795612 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.795580 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c"] Apr 23 18:13:20.799021 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.798998 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.801073 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.801049 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\"" Apr 23 18:13:20.801236 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.801090 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-42261-predictor-serving-cert\"" Apr 23 18:13:20.803789 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.803746 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62"] Apr 23 18:13:20.804101 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.804067 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" podUID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerName="kserve-container" containerID="cri-o://6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4" gracePeriod=30 Apr 23 18:13:20.804203 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.804152 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" podUID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerName="kube-rbac-proxy" containerID="cri-o://0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f" gracePeriod=30 Apr 23 18:13:20.810980 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.810923 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c"] Apr 23 18:13:20.853075 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.853040 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpp5b\" (UniqueName: \"kubernetes.io/projected/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kube-api-access-vpp5b\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.853259 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.853102 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1be91e75-95e3-4216-b4a5-61e5a049ff8c-isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.853259 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.853170 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.853259 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.853218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1be91e75-95e3-4216-b4a5-61e5a049ff8c-proxy-tls\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.953841 ip-10-0-138-17 
kubenswrapper[2571]: I0423 18:13:20.953803 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpp5b\" (UniqueName: \"kubernetes.io/projected/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kube-api-access-vpp5b\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.954009 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.953858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1be91e75-95e3-4216-b4a5-61e5a049ff8c-isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.954009 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.953894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.954009 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.953917 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1be91e75-95e3-4216-b4a5-61e5a049ff8c-proxy-tls\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.954485 ip-10-0-138-17 
kubenswrapper[2571]: I0423 18:13:20.954458 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.954754 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.954729 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1be91e75-95e3-4216-b4a5-61e5a049ff8c-isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.956758 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.956729 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1be91e75-95e3-4216-b4a5-61e5a049ff8c-proxy-tls\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:20.962896 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:20.962866 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpp5b\" (UniqueName: \"kubernetes.io/projected/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kube-api-access-vpp5b\") pod \"isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:21.045712 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.045658 
2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:13:21.112964 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.112911 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:21.156035 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.156001 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d64a7c8b-3580-449e-80d1-a32d922e434b-proxy-tls\") pod \"d64a7c8b-3580-449e-80d1-a32d922e434b\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " Apr 23 18:13:21.156194 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.156057 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qxqr\" (UniqueName: \"kubernetes.io/projected/d64a7c8b-3580-449e-80d1-a32d922e434b-kube-api-access-7qxqr\") pod \"d64a7c8b-3580-449e-80d1-a32d922e434b\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " Apr 23 18:13:21.156194 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.156169 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d64a7c8b-3580-449e-80d1-a32d922e434b-message-dumper-raw-befab-kube-rbac-proxy-sar-config\") pod \"d64a7c8b-3580-449e-80d1-a32d922e434b\" (UID: \"d64a7c8b-3580-449e-80d1-a32d922e434b\") " Apr 23 18:13:21.156744 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.156697 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d64a7c8b-3580-449e-80d1-a32d922e434b-message-dumper-raw-befab-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-befab-kube-rbac-proxy-sar-config") pod "d64a7c8b-3580-449e-80d1-a32d922e434b" (UID: 
"d64a7c8b-3580-449e-80d1-a32d922e434b"). InnerVolumeSpecName "message-dumper-raw-befab-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:13:21.159439 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.158869 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64a7c8b-3580-449e-80d1-a32d922e434b-kube-api-access-7qxqr" (OuterVolumeSpecName: "kube-api-access-7qxqr") pod "d64a7c8b-3580-449e-80d1-a32d922e434b" (UID: "d64a7c8b-3580-449e-80d1-a32d922e434b"). InnerVolumeSpecName "kube-api-access-7qxqr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:13:21.160283 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.160227 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64a7c8b-3580-449e-80d1-a32d922e434b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d64a7c8b-3580-449e-80d1-a32d922e434b" (UID: "d64a7c8b-3580-449e-80d1-a32d922e434b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:13:21.239378 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.239346 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c"] Apr 23 18:13:21.242243 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:13:21.242216 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1be91e75_95e3_4216_b4a5_61e5a049ff8c.slice/crio-4bdc7b92a7df3029e89e921c0dc1faae7749c15efc67a9b1b3d5219a569fbf5e WatchSource:0}: Error finding container 4bdc7b92a7df3029e89e921c0dc1faae7749c15efc67a9b1b3d5219a569fbf5e: Status 404 returned error can't find the container with id 4bdc7b92a7df3029e89e921c0dc1faae7749c15efc67a9b1b3d5219a569fbf5e Apr 23 18:13:21.256888 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.256867 2571 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d64a7c8b-3580-449e-80d1-a32d922e434b-message-dumper-raw-befab-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:13:21.256888 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.256888 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d64a7c8b-3580-449e-80d1-a32d922e434b-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:13:21.257012 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.256902 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qxqr\" (UniqueName: \"kubernetes.io/projected/d64a7c8b-3580-449e-80d1-a32d922e434b-kube-api-access-7qxqr\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:13:21.776918 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.776880 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerID="0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f" exitCode=2 Apr 23 18:13:21.776918 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.776908 2571 generic.go:358] "Generic (PLEG): container finished" podID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerID="6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4" exitCode=2 Apr 23 18:13:21.777441 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.776964 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" event={"ID":"d64a7c8b-3580-449e-80d1-a32d922e434b","Type":"ContainerDied","Data":"0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f"} Apr 23 18:13:21.777441 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.777004 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" event={"ID":"d64a7c8b-3580-449e-80d1-a32d922e434b","Type":"ContainerDied","Data":"6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4"} Apr 23 18:13:21.777441 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.777015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" event={"ID":"d64a7c8b-3580-449e-80d1-a32d922e434b","Type":"ContainerDied","Data":"1604ec5a8b82ee19de3e51c94825286fe3d5463647beb86f8a08c4c412d78fd0"} Apr 23 18:13:21.777441 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.777029 2571 scope.go:117] "RemoveContainer" containerID="0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f" Apr 23 18:13:21.777441 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.777064 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62" Apr 23 18:13:21.778711 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.778666 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" event={"ID":"1be91e75-95e3-4216-b4a5-61e5a049ff8c","Type":"ContainerStarted","Data":"bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615"} Apr 23 18:13:21.778711 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.778704 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" event={"ID":"1be91e75-95e3-4216-b4a5-61e5a049ff8c","Type":"ContainerStarted","Data":"4bdc7b92a7df3029e89e921c0dc1faae7749c15efc67a9b1b3d5219a569fbf5e"} Apr 23 18:13:21.781058 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.781029 2571 generic.go:358] "Generic (PLEG): container finished" podID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerID="c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39" exitCode=2 Apr 23 18:13:21.781201 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.781077 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerDied","Data":"c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39"} Apr 23 18:13:21.786567 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.786550 2571 scope.go:117] "RemoveContainer" containerID="6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4" Apr 23 18:13:21.795466 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.795448 2571 scope.go:117] "RemoveContainer" containerID="0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f" Apr 23 18:13:21.795746 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:13:21.795729 2571 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f\": container with ID starting with 0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f not found: ID does not exist" containerID="0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f" Apr 23 18:13:21.795820 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.795756 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f"} err="failed to get container status \"0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f\": rpc error: code = NotFound desc = could not find container \"0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f\": container with ID starting with 0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f not found: ID does not exist" Apr 23 18:13:21.795820 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.795794 2571 scope.go:117] "RemoveContainer" containerID="6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4" Apr 23 18:13:21.796071 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:13:21.796052 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4\": container with ID starting with 6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4 not found: ID does not exist" containerID="6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4" Apr 23 18:13:21.796133 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.796082 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4"} err="failed to get container status 
\"6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4\": rpc error: code = NotFound desc = could not find container \"6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4\": container with ID starting with 6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4 not found: ID does not exist" Apr 23 18:13:21.796133 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.796107 2571 scope.go:117] "RemoveContainer" containerID="0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f" Apr 23 18:13:21.796341 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.796324 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f"} err="failed to get container status \"0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f\": rpc error: code = NotFound desc = could not find container \"0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f\": container with ID starting with 0c76afc2aef8cdd22f1029e8917eb1282e11d32ba4a2c3af8773d69067dcc60f not found: ID does not exist" Apr 23 18:13:21.796395 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.796342 2571 scope.go:117] "RemoveContainer" containerID="6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4" Apr 23 18:13:21.796527 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.796511 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4"} err="failed to get container status \"6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4\": rpc error: code = NotFound desc = could not find container \"6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4\": container with ID starting with 6b09bde48098a2414cb307d337e45697abee389bc368260bf3d5fce5ef1160c4 not found: ID does not exist" Apr 23 18:13:21.814237 ip-10-0-138-17 
kubenswrapper[2571]: I0423 18:13:21.814205 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62"] Apr 23 18:13:21.817662 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:21.817636 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-befab-predictor-7b987ddd48-tqq62"] Apr 23 18:13:22.492036 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:22.492002 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d64a7c8b-3580-449e-80d1-a32d922e434b" path="/var/lib/kubelet/pods/d64a7c8b-3580-449e-80d1-a32d922e434b/volumes" Apr 23 18:13:23.538142 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:23.538097 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 23 18:13:25.795561 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:25.795525 2571 generic.go:358] "Generic (PLEG): container finished" podID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerID="bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615" exitCode=0 Apr 23 18:13:25.795959 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:25.795610 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" event={"ID":"1be91e75-95e3-4216-b4a5-61e5a049ff8c","Type":"ContainerDied","Data":"bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615"} Apr 23 18:13:26.801127 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:26.801091 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" 
event={"ID":"1be91e75-95e3-4216-b4a5-61e5a049ff8c","Type":"ContainerStarted","Data":"4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6"} Apr 23 18:13:26.801513 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:26.801137 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" event={"ID":"1be91e75-95e3-4216-b4a5-61e5a049ff8c","Type":"ContainerStarted","Data":"b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f"} Apr 23 18:13:26.801513 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:26.801345 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:26.820914 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:26.820858 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podStartSLOduration=6.820843247 podStartE2EDuration="6.820843247s" podCreationTimestamp="2026-04-23 18:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:13:26.81946313 +0000 UTC m=+1942.898035105" watchObservedRunningTime="2026-04-23 18:13:26.820843247 +0000 UTC m=+1942.899415219" Apr 23 18:13:27.804851 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:27.804814 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:27.805949 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:27.805922 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection 
refused" Apr 23 18:13:28.538167 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:28.538125 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 23 18:13:28.542527 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:28.542494 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:13:28.542932 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:28.542908 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:13:28.808855 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:28.808816 2571 generic.go:358] "Generic (PLEG): container finished" podID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerID="ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c" exitCode=0 Apr 23 18:13:28.809348 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:28.808889 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerDied","Data":"ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c"} Apr 23 18:13:28.809348 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:28.809254 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:13:33.537604 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:33.537559 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 23 18:13:33.538050 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:33.537706 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:13:33.813736 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:33.813654 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:13:33.814168 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:33.814143 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:13:38.537764 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:38.537720 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 23 18:13:38.542163 ip-10-0-138-17 kubenswrapper[2571]: I0423 
18:13:38.542128 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:13:38.542484 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:38.542462 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:13:43.537695 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:43.537644 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 23 18:13:43.814883 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:43.814798 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:13:48.538039 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:48.537993 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 23 18:13:48.542396 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:48.542364 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 23 18:13:48.542548 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:48.542531 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:13:48.542816 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:48.542793 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:13:48.542921 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:48.542906 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:13:50.840499 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.840472 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:13:50.872276 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.872244 2571 generic.go:358] "Generic (PLEG): container finished" podID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerID="fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17" exitCode=0 Apr 23 18:13:50.872448 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.872323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerDied","Data":"fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17"} Apr 23 18:13:50.872448 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.872344 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" Apr 23 18:13:50.872448 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.872367 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz" event={"ID":"cfe17efd-10c4-48fe-a81d-9f7ac372748a","Type":"ContainerDied","Data":"b833faca6fb570e49edaa56c58c5ad807c502fa30145303ce0aae7388fad6610"} Apr 23 18:13:50.872448 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.872387 2571 scope.go:117] "RemoveContainer" containerID="fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17" Apr 23 18:13:50.881958 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.881934 2571 scope.go:117] "RemoveContainer" containerID="c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39" Apr 23 18:13:50.889029 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.888998 2571 scope.go:117] "RemoveContainer" containerID="ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c" Apr 23 18:13:50.896142 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.896122 
2571 scope.go:117] "RemoveContainer" containerID="8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398" Apr 23 18:13:50.903409 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.903381 2571 scope.go:117] "RemoveContainer" containerID="fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17" Apr 23 18:13:50.903656 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:13:50.903635 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17\": container with ID starting with fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17 not found: ID does not exist" containerID="fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17" Apr 23 18:13:50.903731 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.903670 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17"} err="failed to get container status \"fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17\": rpc error: code = NotFound desc = could not find container \"fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17\": container with ID starting with fe1b5e94cf1e0c654e3610c557cdbc97001248f1f4654a94b777e982bcbc8d17 not found: ID does not exist" Apr 23 18:13:50.903731 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.903699 2571 scope.go:117] "RemoveContainer" containerID="c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39" Apr 23 18:13:50.903957 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:13:50.903940 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39\": container with ID starting with c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39 not found: ID does 
not exist" containerID="c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39" Apr 23 18:13:50.904018 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.903965 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39"} err="failed to get container status \"c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39\": rpc error: code = NotFound desc = could not find container \"c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39\": container with ID starting with c829a4c98049eecd7c5f50a28ac2e7aa1b2756f141ba2890e0d34f84cd036b39 not found: ID does not exist" Apr 23 18:13:50.904018 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.903989 2571 scope.go:117] "RemoveContainer" containerID="ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c" Apr 23 18:13:50.904230 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:13:50.904211 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c\": container with ID starting with ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c not found: ID does not exist" containerID="ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c" Apr 23 18:13:50.904276 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.904238 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c"} err="failed to get container status \"ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c\": rpc error: code = NotFound desc = could not find container \"ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c\": container with ID starting with ecba9f8cf539e1c41530ff470015ca7c8072ddd79b99fd9096b4890a14bc526c not found: ID does not exist" Apr 23 
18:13:50.904276 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.904253 2571 scope.go:117] "RemoveContainer" containerID="8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398" Apr 23 18:13:50.904475 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:13:50.904457 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398\": container with ID starting with 8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398 not found: ID does not exist" containerID="8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398" Apr 23 18:13:50.904516 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.904483 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398"} err="failed to get container status \"8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398\": rpc error: code = NotFound desc = could not find container \"8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398\": container with ID starting with 8c98e67e7ca2298be685ecea81656e5ad6e5babe886bb22df418b2d8e17fe398 not found: ID does not exist" Apr 23 18:13:50.989348 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.989261 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vw86\" (UniqueName: \"kubernetes.io/projected/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kube-api-access-7vw86\") pod \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " Apr 23 18:13:50.989348 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.989301 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kserve-provision-location\") pod 
\"cfe17efd-10c4-48fe-a81d-9f7ac372748a\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " Apr 23 18:13:50.989348 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.989343 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cfe17efd-10c4-48fe-a81d-9f7ac372748a-isvc-logger-raw-befab-kube-rbac-proxy-sar-config\") pod \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " Apr 23 18:13:50.989639 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.989485 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfe17efd-10c4-48fe-a81d-9f7ac372748a-proxy-tls\") pod \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\" (UID: \"cfe17efd-10c4-48fe-a81d-9f7ac372748a\") " Apr 23 18:13:50.989703 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.989671 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cfe17efd-10c4-48fe-a81d-9f7ac372748a" (UID: "cfe17efd-10c4-48fe-a81d-9f7ac372748a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:13:50.989749 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.989707 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe17efd-10c4-48fe-a81d-9f7ac372748a-isvc-logger-raw-befab-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-befab-kube-rbac-proxy-sar-config") pod "cfe17efd-10c4-48fe-a81d-9f7ac372748a" (UID: "cfe17efd-10c4-48fe-a81d-9f7ac372748a"). InnerVolumeSpecName "isvc-logger-raw-befab-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:13:50.991617 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.991591 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe17efd-10c4-48fe-a81d-9f7ac372748a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cfe17efd-10c4-48fe-a81d-9f7ac372748a" (UID: "cfe17efd-10c4-48fe-a81d-9f7ac372748a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:13:50.991722 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:50.991638 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kube-api-access-7vw86" (OuterVolumeSpecName: "kube-api-access-7vw86") pod "cfe17efd-10c4-48fe-a81d-9f7ac372748a" (UID: "cfe17efd-10c4-48fe-a81d-9f7ac372748a"). InnerVolumeSpecName "kube-api-access-7vw86". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:13:51.090904 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:51.090868 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vw86\" (UniqueName: \"kubernetes.io/projected/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kube-api-access-7vw86\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:13:51.090904 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:51.090902 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfe17efd-10c4-48fe-a81d-9f7ac372748a-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:13:51.091138 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:51.090922 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-befab-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cfe17efd-10c4-48fe-a81d-9f7ac372748a-isvc-logger-raw-befab-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" 
DevicePath \"\"" Apr 23 18:13:51.091138 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:51.090935 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfe17efd-10c4-48fe-a81d-9f7ac372748a-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:13:51.196815 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:51.196762 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz"] Apr 23 18:13:51.201205 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:51.201179 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-befab-predictor-76d9f9bbb8-64kcz"] Apr 23 18:13:52.492211 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:52.492177 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" path="/var/lib/kubelet/pods/cfe17efd-10c4-48fe-a81d-9f7ac372748a/volumes" Apr 23 18:13:53.814504 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:13:53.814418 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:14:03.814555 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:14:03.814514 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:14:13.814200 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:14:13.814157 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" 
podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:14:23.814264 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:14:23.814225 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:14:33.814546 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:14:33.814503 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:14:43.814466 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:14:43.814419 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:14:53.814375 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:14:53.814330 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:15:00.488840 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:00.488797 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:15:10.488933 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:10.488881 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:15:20.488882 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:20.488844 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:15:30.489610 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:30.489514 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:15:40.488971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:40.488924 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 18:15:50.491909 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:50.491880 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:15:50.967125 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:50.967084 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c"] Apr 23 18:15:51.118218 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118174 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln"] Apr 23 18:15:51.118508 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118492 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="storage-initializer" Apr 23 18:15:51.118592 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118512 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="storage-initializer" Apr 23 18:15:51.118592 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118528 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" Apr 23 18:15:51.118592 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118536 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" Apr 23 18:15:51.118592 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118549 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" Apr 23 18:15:51.118592 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118558 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" Apr 23 18:15:51.118592 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118576 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" Apr 23 18:15:51.118592 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118584 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" Apr 23 18:15:51.118971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118600 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerName="kube-rbac-proxy" Apr 23 18:15:51.118971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118609 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerName="kube-rbac-proxy" Apr 23 18:15:51.118971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118623 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerName="kserve-container" Apr 23 18:15:51.118971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118631 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerName="kserve-container" Apr 23 18:15:51.118971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118709 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kube-rbac-proxy" Apr 23 18:15:51.118971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118725 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="kserve-container" Apr 23 18:15:51.118971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118735 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerName="kube-rbac-proxy" Apr 23 18:15:51.118971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118746 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfe17efd-10c4-48fe-a81d-9f7ac372748a" containerName="agent" Apr 23 18:15:51.118971 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.118755 2571 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="d64a7c8b-3580-449e-80d1-a32d922e434b" containerName="kserve-container" Apr 23 18:15:51.121900 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.121879 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.124859 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.124839 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-93cf55-predictor-serving-cert\"" Apr 23 18:15:51.124944 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.124912 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-93cf55-kube-rbac-proxy-sar-config\"" Apr 23 18:15:51.142222 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.142197 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln"] Apr 23 18:15:51.177395 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.177351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjhv8\" (UniqueName: \"kubernetes.io/projected/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kube-api-access-xjhv8\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.177395 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.177392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kserve-provision-location\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.177597 ip-10-0-138-17 
kubenswrapper[2571]: I0423 18:15:51.177421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/57b28371-a97a-4cf1-83dd-3f576ed34e1f-isvc-primary-93cf55-kube-rbac-proxy-sar-config\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.177597 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.177469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57b28371-a97a-4cf1-83dd-3f576ed34e1f-proxy-tls\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.200645 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.200607 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" containerID="cri-o://b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f" gracePeriod=30 Apr 23 18:15:51.200827 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.200625 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kube-rbac-proxy" containerID="cri-o://4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6" gracePeriod=30 Apr 23 18:15:51.278811 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.278740 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-primary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/57b28371-a97a-4cf1-83dd-3f576ed34e1f-isvc-primary-93cf55-kube-rbac-proxy-sar-config\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.278997 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.278827 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57b28371-a97a-4cf1-83dd-3f576ed34e1f-proxy-tls\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.278997 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.278879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjhv8\" (UniqueName: \"kubernetes.io/projected/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kube-api-access-xjhv8\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.278997 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.278898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kserve-provision-location\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.279246 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.279223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kserve-provision-location\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.279437 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.279410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/57b28371-a97a-4cf1-83dd-3f576ed34e1f-isvc-primary-93cf55-kube-rbac-proxy-sar-config\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.281417 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.281398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57b28371-a97a-4cf1-83dd-3f576ed34e1f-proxy-tls\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.295782 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.295740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjhv8\" (UniqueName: \"kubernetes.io/projected/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kube-api-access-xjhv8\") pod \"isvc-primary-93cf55-predictor-84b5796f4b-thmln\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") " pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.431201 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.431167 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:51.557136 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.557048 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln"] Apr 23 18:15:51.560288 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:15:51.560261 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b28371_a97a_4cf1_83dd_3f576ed34e1f.slice/crio-6ebdaa957cf8ce0a084a7d462116177b596f33027e8f4c3da793344b427c9f15 WatchSource:0}: Error finding container 6ebdaa957cf8ce0a084a7d462116177b596f33027e8f4c3da793344b427c9f15: Status 404 returned error can't find the container with id 6ebdaa957cf8ce0a084a7d462116177b596f33027e8f4c3da793344b427c9f15 Apr 23 18:15:51.562161 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:51.562143 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:15:52.204207 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:52.204171 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" event={"ID":"57b28371-a97a-4cf1-83dd-3f576ed34e1f","Type":"ContainerStarted","Data":"678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a"} Apr 23 18:15:52.204207 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:52.204211 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" event={"ID":"57b28371-a97a-4cf1-83dd-3f576ed34e1f","Type":"ContainerStarted","Data":"6ebdaa957cf8ce0a084a7d462116177b596f33027e8f4c3da793344b427c9f15"} Apr 23 18:15:52.206020 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:52.205991 2571 generic.go:358] "Generic (PLEG): container finished" podID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" 
containerID="4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6" exitCode=2 Apr 23 18:15:52.206147 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:52.206072 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" event={"ID":"1be91e75-95e3-4216-b4a5-61e5a049ff8c","Type":"ContainerDied","Data":"4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6"} Apr 23 18:15:53.809469 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:53.809425 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 23 18:15:56.218367 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:56.218334 2571 generic.go:358] "Generic (PLEG): container finished" podID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerID="678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a" exitCode=0 Apr 23 18:15:56.218756 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:56.218416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" event={"ID":"57b28371-a97a-4cf1-83dd-3f576ed34e1f","Type":"ContainerDied","Data":"678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a"} Apr 23 18:15:57.223102 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:57.223062 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" event={"ID":"57b28371-a97a-4cf1-83dd-3f576ed34e1f","Type":"ContainerStarted","Data":"888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec"} Apr 23 18:15:57.223481 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:57.223112 2571 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" event={"ID":"57b28371-a97a-4cf1-83dd-3f576ed34e1f","Type":"ContainerStarted","Data":"627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d"} Apr 23 18:15:57.223481 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:57.223437 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:57.223598 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:57.223581 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:15:57.224834 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:57.224802 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:15:57.261073 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:57.260987 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podStartSLOduration=6.26097363 podStartE2EDuration="6.26097363s" podCreationTimestamp="2026-04-23 18:15:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:15:57.26059587 +0000 UTC m=+2093.339167881" watchObservedRunningTime="2026-04-23 18:15:57.26097363 +0000 UTC m=+2093.339545608" Apr 23 18:15:58.225756 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:58.225715 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:15:58.809758 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:15:58.809718 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 23 18:16:00.540794 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.540752 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:16:00.645325 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.645288 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1be91e75-95e3-4216-b4a5-61e5a049ff8c-proxy-tls\") pod \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " Apr 23 18:16:00.645496 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.645340 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1be91e75-95e3-4216-b4a5-61e5a049ff8c-isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\") pod \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " Apr 23 18:16:00.645496 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.645458 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kserve-provision-location\") pod \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " Apr 23 18:16:00.645571 ip-10-0-138-17 kubenswrapper[2571]: 
I0423 18:16:00.645496 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpp5b\" (UniqueName: \"kubernetes.io/projected/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kube-api-access-vpp5b\") pod \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\" (UID: \"1be91e75-95e3-4216-b4a5-61e5a049ff8c\") " Apr 23 18:16:00.645737 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.645711 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1be91e75-95e3-4216-b4a5-61e5a049ff8c-isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config") pod "1be91e75-95e3-4216-b4a5-61e5a049ff8c" (UID: "1be91e75-95e3-4216-b4a5-61e5a049ff8c"). InnerVolumeSpecName "isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:16:00.645835 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.645750 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1be91e75-95e3-4216-b4a5-61e5a049ff8c" (UID: "1be91e75-95e3-4216-b4a5-61e5a049ff8c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:16:00.647671 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.647606 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be91e75-95e3-4216-b4a5-61e5a049ff8c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1be91e75-95e3-4216-b4a5-61e5a049ff8c" (UID: "1be91e75-95e3-4216-b4a5-61e5a049ff8c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:16:00.647671 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.647654 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kube-api-access-vpp5b" (OuterVolumeSpecName: "kube-api-access-vpp5b") pod "1be91e75-95e3-4216-b4a5-61e5a049ff8c" (UID: "1be91e75-95e3-4216-b4a5-61e5a049ff8c"). InnerVolumeSpecName "kube-api-access-vpp5b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:16:00.746362 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.746332 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1be91e75-95e3-4216-b4a5-61e5a049ff8c-isvc-sklearn-scale-raw-42261-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:16:00.746362 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.746362 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:16:00.746548 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.746374 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vpp5b\" (UniqueName: \"kubernetes.io/projected/1be91e75-95e3-4216-b4a5-61e5a049ff8c-kube-api-access-vpp5b\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:16:00.746548 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:00.746383 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1be91e75-95e3-4216-b4a5-61e5a049ff8c-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:16:01.238886 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.238852 2571 generic.go:358] "Generic (PLEG): 
container finished" podID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerID="b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f" exitCode=0 Apr 23 18:16:01.239061 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.238938 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" Apr 23 18:16:01.239061 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.238958 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" event={"ID":"1be91e75-95e3-4216-b4a5-61e5a049ff8c","Type":"ContainerDied","Data":"b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f"} Apr 23 18:16:01.239061 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.238990 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" event={"ID":"1be91e75-95e3-4216-b4a5-61e5a049ff8c","Type":"ContainerDied","Data":"4bdc7b92a7df3029e89e921c0dc1faae7749c15efc67a9b1b3d5219a569fbf5e"} Apr 23 18:16:01.239061 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.239006 2571 scope.go:117] "RemoveContainer" containerID="4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6" Apr 23 18:16:01.247434 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.247411 2571 scope.go:117] "RemoveContainer" containerID="b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f" Apr 23 18:16:01.254461 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.254443 2571 scope.go:117] "RemoveContainer" containerID="bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615" Apr 23 18:16:01.261267 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.261240 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c"] Apr 23 18:16:01.261600 ip-10-0-138-17 kubenswrapper[2571]: 
I0423 18:16:01.261581 2571 scope.go:117] "RemoveContainer" containerID="4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6" Apr 23 18:16:01.261957 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:16:01.261925 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6\": container with ID starting with 4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6 not found: ID does not exist" containerID="4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6" Apr 23 18:16:01.262051 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.261970 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6"} err="failed to get container status \"4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6\": rpc error: code = NotFound desc = could not find container \"4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6\": container with ID starting with 4e7be6b011e467f79abdf02a9e7f77f46332b869b27b6108292111772397e0a6 not found: ID does not exist" Apr 23 18:16:01.262051 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.261998 2571 scope.go:117] "RemoveContainer" containerID="b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f" Apr 23 18:16:01.262269 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:16:01.262251 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f\": container with ID starting with b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f not found: ID does not exist" containerID="b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f" Apr 23 18:16:01.262308 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.262282 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f"} err="failed to get container status \"b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f\": rpc error: code = NotFound desc = could not find container \"b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f\": container with ID starting with b29f576b2394ef1e7ddb327b4e6a47cd778f61df9cfd5020e809a036e0b49b5f not found: ID does not exist" Apr 23 18:16:01.262308 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.262300 2571 scope.go:117] "RemoveContainer" containerID="bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615" Apr 23 18:16:01.262567 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:16:01.262550 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615\": container with ID starting with bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615 not found: ID does not exist" containerID="bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615" Apr 23 18:16:01.262609 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.262572 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615"} err="failed to get container status \"bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615\": rpc error: code = NotFound desc = could not find container \"bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615\": container with ID starting with bf110184e62274139a5ced0774ad2155c830a92f13ede6bfdcbf269d132ce615 not found: ID does not exist" Apr 23 18:16:01.264991 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.264974 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c"] Apr 23 18:16:01.489530 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:01.489433 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-42261-predictor-58bb868d67-q8t9c" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: i/o timeout" Apr 23 18:16:02.492442 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:02.492403 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" path="/var/lib/kubelet/pods/1be91e75-95e3-4216-b4a5-61e5a049ff8c/volumes" Apr 23 18:16:03.229603 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:03.229579 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:16:03.230202 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:03.230175 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:16:04.474863 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:04.474836 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 18:16:04.479492 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:04.479466 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log" Apr 23 18:16:13.230319 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:13.230279 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:16:23.230819 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:23.230762 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:16:33.230193 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:33.230151 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:16:43.230133 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:43.230089 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:16:53.230991 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:16:53.230902 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:17:03.230150 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:03.230106 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:17:05.488339 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:05.488298 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 23 18:17:15.489005 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:15.488962 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" Apr 23 18:17:21.306879 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.306838 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt"] Apr 23 18:17:21.307374 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.307143 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="storage-initializer" Apr 23 18:17:21.307374 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.307156 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="storage-initializer" Apr 23 18:17:21.307374 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.307176 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" Apr 23 18:17:21.307374 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.307186 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" Apr 23 18:17:21.307374 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.307194 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kube-rbac-proxy" Apr 23 18:17:21.307374 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.307201 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kube-rbac-proxy" Apr 23 18:17:21.307374 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.307247 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kube-rbac-proxy" Apr 23 18:17:21.307374 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.307260 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1be91e75-95e3-4216-b4a5-61e5a049ff8c" containerName="kserve-container" Apr 23 18:17:21.310182 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.310164 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.312216 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.312191 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-93cf55-predictor-serving-cert\"" Apr 23 18:17:21.312346 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.312278 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-93cf55-kube-rbac-proxy-sar-config\"" Apr 23 18:17:21.312403 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.312278 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-93cf55\"" Apr 23 18:17:21.312493 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.312475 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 18:17:21.312624 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.312605 2571 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-93cf55-dockercfg-z6ltp\"" Apr 23 18:17:21.322372 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.322349 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt"] Apr 23 18:17:21.443169 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.443131 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-isvc-secondary-93cf55-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.443169 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.443171 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kserve-provision-location\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.443376 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.443195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cacc9e94-0d77-40eb-94c3-2511206fe0fe-proxy-tls\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.443376 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.443259 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f2cmm\" (UniqueName: \"kubernetes.io/projected/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kube-api-access-f2cmm\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.443376 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.443343 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-cabundle-cert\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.544676 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.544643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-cabundle-cert\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.544881 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.544702 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-isvc-secondary-93cf55-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.544881 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.544841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kserve-provision-location\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.545028 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.544908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cacc9e94-0d77-40eb-94c3-2511206fe0fe-proxy-tls\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.545028 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.544930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2cmm\" (UniqueName: \"kubernetes.io/projected/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kube-api-access-f2cmm\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.545284 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.545261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kserve-provision-location\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.545497 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.545477 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-isvc-secondary-93cf55-kube-rbac-proxy-sar-config\") pod 
\"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.545599 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.545576 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-cabundle-cert\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.547570 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.547542 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cacc9e94-0d77-40eb-94c3-2511206fe0fe-proxy-tls\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.552874 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.552851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2cmm\" (UniqueName: \"kubernetes.io/projected/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kube-api-access-f2cmm\") pod \"isvc-secondary-93cf55-predictor-8bd9574b9-lblzt\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.620249 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.620167 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:21.746389 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:21.746357 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt"] Apr 23 18:17:21.749516 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:17:21.749484 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcacc9e94_0d77_40eb_94c3_2511206fe0fe.slice/crio-5d43137a7eb8418a32475b1f93e28be0cb151f6c0090bf5f1059bd4c4778410d WatchSource:0}: Error finding container 5d43137a7eb8418a32475b1f93e28be0cb151f6c0090bf5f1059bd4c4778410d: Status 404 returned error can't find the container with id 5d43137a7eb8418a32475b1f93e28be0cb151f6c0090bf5f1059bd4c4778410d Apr 23 18:17:22.455638 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:22.455596 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" event={"ID":"cacc9e94-0d77-40eb-94c3-2511206fe0fe","Type":"ContainerStarted","Data":"36efc7304b3235dfece05230e0ca70b672d0ab474fb2ecf44d9ca630b9c3f9f0"} Apr 23 18:17:22.455638 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:22.455631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" event={"ID":"cacc9e94-0d77-40eb-94c3-2511206fe0fe","Type":"ContainerStarted","Data":"5d43137a7eb8418a32475b1f93e28be0cb151f6c0090bf5f1059bd4c4778410d"} Apr 23 18:17:24.462531 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:24.462498 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_cacc9e94-0d77-40eb-94c3-2511206fe0fe/storage-initializer/0.log" Apr 23 18:17:24.462937 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:24.462538 2571 generic.go:358] "Generic (PLEG): container 
finished" podID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" containerID="36efc7304b3235dfece05230e0ca70b672d0ab474fb2ecf44d9ca630b9c3f9f0" exitCode=1 Apr 23 18:17:24.462937 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:24.462600 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" event={"ID":"cacc9e94-0d77-40eb-94c3-2511206fe0fe","Type":"ContainerDied","Data":"36efc7304b3235dfece05230e0ca70b672d0ab474fb2ecf44d9ca630b9c3f9f0"} Apr 23 18:17:25.466864 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:25.466835 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_cacc9e94-0d77-40eb-94c3-2511206fe0fe/storage-initializer/0.log" Apr 23 18:17:25.467233 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:25.466899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" event={"ID":"cacc9e94-0d77-40eb-94c3-2511206fe0fe","Type":"ContainerStarted","Data":"9c702ea3941a4ea9ce634b2c723d6fe00b4d6bb590d338e2f794682543020dc0"} Apr 23 18:17:30.480980 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:30.480949 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_cacc9e94-0d77-40eb-94c3-2511206fe0fe/storage-initializer/1.log" Apr 23 18:17:30.481418 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:30.481344 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_cacc9e94-0d77-40eb-94c3-2511206fe0fe/storage-initializer/0.log" Apr 23 18:17:30.481418 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:30.481377 2571 generic.go:358] "Generic (PLEG): container finished" podID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" containerID="9c702ea3941a4ea9ce634b2c723d6fe00b4d6bb590d338e2f794682543020dc0" exitCode=1 Apr 23 18:17:30.481489 
ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:30.481434 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" event={"ID":"cacc9e94-0d77-40eb-94c3-2511206fe0fe","Type":"ContainerDied","Data":"9c702ea3941a4ea9ce634b2c723d6fe00b4d6bb590d338e2f794682543020dc0"} Apr 23 18:17:30.481489 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:30.481468 2571 scope.go:117] "RemoveContainer" containerID="36efc7304b3235dfece05230e0ca70b672d0ab474fb2ecf44d9ca630b9c3f9f0" Apr 23 18:17:30.481891 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:30.481875 2571 scope.go:117] "RemoveContainer" containerID="36efc7304b3235dfece05230e0ca70b672d0ab474fb2ecf44d9ca630b9c3f9f0" Apr 23 18:17:30.492384 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:17:30.492352 2571 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_kserve-ci-e2e-test_cacc9e94-0d77-40eb-94c3-2511206fe0fe_0 in pod sandbox 5d43137a7eb8418a32475b1f93e28be0cb151f6c0090bf5f1059bd4c4778410d from index: no such id: '36efc7304b3235dfece05230e0ca70b672d0ab474fb2ecf44d9ca630b9c3f9f0'" containerID="36efc7304b3235dfece05230e0ca70b672d0ab474fb2ecf44d9ca630b9c3f9f0" Apr 23 18:17:30.492450 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:30.492397 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36efc7304b3235dfece05230e0ca70b672d0ab474fb2ecf44d9ca630b9c3f9f0"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_kserve-ci-e2e-test_cacc9e94-0d77-40eb-94c3-2511206fe0fe_0 in pod sandbox 5d43137a7eb8418a32475b1f93e28be0cb151f6c0090bf5f1059bd4c4778410d from index: no such id: '36efc7304b3235dfece05230e0ca70b672d0ab474fb2ecf44d9ca630b9c3f9f0'" Apr 23 18:17:30.492579 ip-10-0-138-17 kubenswrapper[2571]: E0423 
18:17:30.492560 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_kserve-ci-e2e-test(cacc9e94-0d77-40eb-94c3-2511206fe0fe)\"" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" podUID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" Apr 23 18:17:31.485203 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:31.485173 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_cacc9e94-0d77-40eb-94c3-2511206fe0fe/storage-initializer/1.log" Apr 23 18:17:35.559890 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.559859 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt"] Apr 23 18:17:35.648740 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.648709 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln"] Apr 23 18:17:35.649116 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.649087 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container" containerID="cri-o://627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d" gracePeriod=30 Apr 23 18:17:35.649253 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.649143 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kube-rbac-proxy" containerID="cri-o://888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec" gracePeriod=30 Apr 23 18:17:35.676950 
ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.676928 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"] Apr 23 18:17:35.681502 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.681479 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.683429 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.683405 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-6c6f0e-predictor-serving-cert\"" Apr 23 18:17:35.683534 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.683478 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-6c6f0e-dockercfg-czq5m\"" Apr 23 18:17:35.683598 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.683538 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-6c6f0e\"" Apr 23 18:17:35.683598 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.683579 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\"" Apr 23 18:17:35.690705 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.690682 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"] Apr 23 18:17:35.748553 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.748531 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s5t5\" (UniqueName: \"kubernetes.io/projected/77010218-e7eb-4d94-a9f0-466ca2b695f3-kube-api-access-8s5t5\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.748688 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.748562 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77010218-e7eb-4d94-a9f0-466ca2b695f3-kserve-provision-location\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.748688 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.748585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-cabundle-cert\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.748841 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.748681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77010218-e7eb-4d94-a9f0-466ca2b695f3-proxy-tls\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.748841 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.748716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.766746 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.766730 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_cacc9e94-0d77-40eb-94c3-2511206fe0fe/storage-initializer/1.log" Apr 23 18:17:35.766854 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.766828 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" Apr 23 18:17:35.849508 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.849439 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2cmm\" (UniqueName: \"kubernetes.io/projected/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kube-api-access-f2cmm\") pod \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " Apr 23 18:17:35.849508 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.849476 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-isvc-secondary-93cf55-kube-rbac-proxy-sar-config\") pod \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " Apr 23 18:17:35.849663 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.849512 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cacc9e94-0d77-40eb-94c3-2511206fe0fe-proxy-tls\") pod \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " Apr 23 18:17:35.849723 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.849701 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-cabundle-cert\") pod \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " Apr 23 18:17:35.849837 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.849820 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kserve-provision-location\") pod \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\" (UID: \"cacc9e94-0d77-40eb-94c3-2511206fe0fe\") " Apr 23 18:17:35.849920 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.849898 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-isvc-secondary-93cf55-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-93cf55-kube-rbac-proxy-sar-config") pod "cacc9e94-0d77-40eb-94c3-2511206fe0fe" (UID: "cacc9e94-0d77-40eb-94c3-2511206fe0fe"). InnerVolumeSpecName "isvc-secondary-93cf55-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:17:35.850010 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.849921 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77010218-e7eb-4d94-a9f0-466ca2b695f3-proxy-tls\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.850068 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:17:35.850016 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-serving-cert: secret "isvc-init-fail-6c6f0e-predictor-serving-cert" not found Apr 23 18:17:35.850068 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.850174 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:17:35.850075 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77010218-e7eb-4d94-a9f0-466ca2b695f3-proxy-tls podName:77010218-e7eb-4d94-a9f0-466ca2b695f3 nodeName:}" failed. No retries permitted until 2026-04-23 18:17:36.350055027 +0000 UTC m=+2192.428626983 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/77010218-e7eb-4d94-a9f0-466ca2b695f3-proxy-tls") pod "isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" (UID: "77010218-e7eb-4d94-a9f0-466ca2b695f3") : secret "isvc-init-fail-6c6f0e-predictor-serving-cert" not found Apr 23 18:17:35.850174 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s5t5\" (UniqueName: \"kubernetes.io/projected/77010218-e7eb-4d94-a9f0-466ca2b695f3-kube-api-access-8s5t5\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.850174 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850137 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cacc9e94-0d77-40eb-94c3-2511206fe0fe" (UID: "cacc9e94-0d77-40eb-94c3-2511206fe0fe"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:17:35.850174 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850147 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "cacc9e94-0d77-40eb-94c3-2511206fe0fe" (UID: "cacc9e94-0d77-40eb-94c3-2511206fe0fe"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:17:35.850369 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77010218-e7eb-4d94-a9f0-466ca2b695f3-kserve-provision-location\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.850369 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-cabundle-cert\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:35.850369 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850304 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-isvc-secondary-93cf55-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:17:35.850369 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850321 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cacc9e94-0d77-40eb-94c3-2511206fe0fe-cabundle-cert\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:17:35.850369 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850336 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath 
\"\""
Apr 23 18:17:35.850556 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850518 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77010218-e7eb-4d94-a9f0-466ca2b695f3-kserve-provision-location\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"
Apr 23 18:17:35.850735 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850711 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"
Apr 23 18:17:35.850932 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.850914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-cabundle-cert\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"
Apr 23 18:17:35.852251 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.852231 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kube-api-access-f2cmm" (OuterVolumeSpecName: "kube-api-access-f2cmm") pod "cacc9e94-0d77-40eb-94c3-2511206fe0fe" (UID: "cacc9e94-0d77-40eb-94c3-2511206fe0fe"). InnerVolumeSpecName "kube-api-access-f2cmm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:17:35.852344 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.852265 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cacc9e94-0d77-40eb-94c3-2511206fe0fe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cacc9e94-0d77-40eb-94c3-2511206fe0fe" (UID: "cacc9e94-0d77-40eb-94c3-2511206fe0fe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:17:35.859248 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.859229 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s5t5\" (UniqueName: \"kubernetes.io/projected/77010218-e7eb-4d94-a9f0-466ca2b695f3-kube-api-access-8s5t5\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"
Apr 23 18:17:35.951534 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.951503 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f2cmm\" (UniqueName: \"kubernetes.io/projected/cacc9e94-0d77-40eb-94c3-2511206fe0fe-kube-api-access-f2cmm\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:35.951534 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:35.951529 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cacc9e94-0d77-40eb-94c3-2511206fe0fe-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:36.354996 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.354956 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77010218-e7eb-4d94-a9f0-466ca2b695f3-proxy-tls\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"
Apr 23 18:17:36.357517 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.357498 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77010218-e7eb-4d94-a9f0-466ca2b695f3-proxy-tls\") pod \"isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"
Apr 23 18:17:36.500520 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.500489 2571 generic.go:358] "Generic (PLEG): container finished" podID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerID="888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec" exitCode=2
Apr 23 18:17:36.500652 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.500550 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" event={"ID":"57b28371-a97a-4cf1-83dd-3f576ed34e1f","Type":"ContainerDied","Data":"888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec"}
Apr 23 18:17:36.501574 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.501554 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-93cf55-predictor-8bd9574b9-lblzt_cacc9e94-0d77-40eb-94c3-2511206fe0fe/storage-initializer/1.log"
Apr 23 18:17:36.501693 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.501621 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt" event={"ID":"cacc9e94-0d77-40eb-94c3-2511206fe0fe","Type":"ContainerDied","Data":"5d43137a7eb8418a32475b1f93e28be0cb151f6c0090bf5f1059bd4c4778410d"}
Apr 23 18:17:36.501693 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.501646 2571 scope.go:117] "RemoveContainer" containerID="9c702ea3941a4ea9ce634b2c723d6fe00b4d6bb590d338e2f794682543020dc0"
Apr 23 18:17:36.501693 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.501686 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt"
Apr 23 18:17:36.535010 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.534981 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt"]
Apr 23 18:17:36.539860 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.539835 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-93cf55-predictor-8bd9574b9-lblzt"]
Apr 23 18:17:36.595978 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.595937 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"
Apr 23 18:17:36.743231 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:36.743145 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"]
Apr 23 18:17:36.748086 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:17:36.748056 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77010218_e7eb_4d94_a9f0_466ca2b695f3.slice/crio-26e00affa42386a0e8e593fdf6ad45ebb8c06d8b11c746dc2dd4977f15a50355 WatchSource:0}: Error finding container 26e00affa42386a0e8e593fdf6ad45ebb8c06d8b11c746dc2dd4977f15a50355: Status 404 returned error can't find the container with id 26e00affa42386a0e8e593fdf6ad45ebb8c06d8b11c746dc2dd4977f15a50355
Apr 23 18:17:37.507136 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:37.507096 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" event={"ID":"77010218-e7eb-4d94-a9f0-466ca2b695f3","Type":"ContainerStarted","Data":"06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529"}
Apr 23 18:17:37.507136 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:37.507138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" event={"ID":"77010218-e7eb-4d94-a9f0-466ca2b695f3","Type":"ContainerStarted","Data":"26e00affa42386a0e8e593fdf6ad45ebb8c06d8b11c746dc2dd4977f15a50355"}
Apr 23 18:17:38.226021 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:38.225976 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.23:8643/healthz\": dial tcp 10.132.0.23:8643: connect: connection refused"
Apr 23 18:17:38.491936 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:38.491861 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" path="/var/lib/kubelet/pods/cacc9e94-0d77-40eb-94c3-2511206fe0fe/volumes"
Apr 23 18:17:40.087419 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.087394 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln"
Apr 23 18:17:40.184520 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.184423 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kserve-provision-location\") pod \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") "
Apr 23 18:17:40.184520 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.184459 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57b28371-a97a-4cf1-83dd-3f576ed34e1f-proxy-tls\") pod \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") "
Apr 23 18:17:40.184520 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.184486 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjhv8\" (UniqueName: \"kubernetes.io/projected/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kube-api-access-xjhv8\") pod \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") "
Apr 23 18:17:40.184829 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.184533 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/57b28371-a97a-4cf1-83dd-3f576ed34e1f-isvc-primary-93cf55-kube-rbac-proxy-sar-config\") pod \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\" (UID: \"57b28371-a97a-4cf1-83dd-3f576ed34e1f\") "
Apr 23 18:17:40.184882 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.184854 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "57b28371-a97a-4cf1-83dd-3f576ed34e1f" (UID: "57b28371-a97a-4cf1-83dd-3f576ed34e1f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:17:40.184980 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.184954 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b28371-a97a-4cf1-83dd-3f576ed34e1f-isvc-primary-93cf55-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-93cf55-kube-rbac-proxy-sar-config") pod "57b28371-a97a-4cf1-83dd-3f576ed34e1f" (UID: "57b28371-a97a-4cf1-83dd-3f576ed34e1f"). InnerVolumeSpecName "isvc-primary-93cf55-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:17:40.186720 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.186697 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b28371-a97a-4cf1-83dd-3f576ed34e1f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "57b28371-a97a-4cf1-83dd-3f576ed34e1f" (UID: "57b28371-a97a-4cf1-83dd-3f576ed34e1f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:17:40.186836 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.186815 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kube-api-access-xjhv8" (OuterVolumeSpecName: "kube-api-access-xjhv8") pod "57b28371-a97a-4cf1-83dd-3f576ed34e1f" (UID: "57b28371-a97a-4cf1-83dd-3f576ed34e1f"). InnerVolumeSpecName "kube-api-access-xjhv8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:17:40.284998 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.284955 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-93cf55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/57b28371-a97a-4cf1-83dd-3f576ed34e1f-isvc-primary-93cf55-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:40.284998 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.284987 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:40.284998 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.284997 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57b28371-a97a-4cf1-83dd-3f576ed34e1f-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:40.285226 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.285013 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xjhv8\" (UniqueName: \"kubernetes.io/projected/57b28371-a97a-4cf1-83dd-3f576ed34e1f-kube-api-access-xjhv8\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:40.518285 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.518254 2571 generic.go:358] "Generic (PLEG): container finished" podID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerID="627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d" exitCode=0
Apr 23 18:17:40.518443 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.518341 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln"
Apr 23 18:17:40.518443 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.518338 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" event={"ID":"57b28371-a97a-4cf1-83dd-3f576ed34e1f","Type":"ContainerDied","Data":"627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d"}
Apr 23 18:17:40.518517 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.518445 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln" event={"ID":"57b28371-a97a-4cf1-83dd-3f576ed34e1f","Type":"ContainerDied","Data":"6ebdaa957cf8ce0a084a7d462116177b596f33027e8f4c3da793344b427c9f15"}
Apr 23 18:17:40.518517 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.518462 2571 scope.go:117] "RemoveContainer" containerID="888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec"
Apr 23 18:17:40.526146 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.526120 2571 scope.go:117] "RemoveContainer" containerID="627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d"
Apr 23 18:17:40.533432 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.533414 2571 scope.go:117] "RemoveContainer" containerID="678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a"
Apr 23 18:17:40.535802 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.535759 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln"]
Apr 23 18:17:40.540539 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.540516 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-93cf55-predictor-84b5796f4b-thmln"]
Apr 23 18:17:40.541112 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.541097 2571 scope.go:117] "RemoveContainer" containerID="888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec"
Apr 23 18:17:40.541368 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:17:40.541350 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec\": container with ID starting with 888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec not found: ID does not exist" containerID="888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec"
Apr 23 18:17:40.541436 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.541382 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec"} err="failed to get container status \"888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec\": rpc error: code = NotFound desc = could not find container \"888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec\": container with ID starting with 888850ac3444f1afeb16f0e92e14e8411ebffa071a67b9c2d4816f59d4bf37ec not found: ID does not exist"
Apr 23 18:17:40.541436 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.541410 2571 scope.go:117] "RemoveContainer" containerID="627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d"
Apr 23 18:17:40.541664 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:17:40.541646 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d\": container with ID starting with 627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d not found: ID does not exist" containerID="627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d"
Apr 23 18:17:40.541703 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.541672 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d"} err="failed to get container status \"627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d\": rpc error: code = NotFound desc = could not find container \"627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d\": container with ID starting with 627955dc838d09e27bc3e0495f3b843743130da93e996f21b4b2e6924a6a7e5d not found: ID does not exist"
Apr 23 18:17:40.541703 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.541688 2571 scope.go:117] "RemoveContainer" containerID="678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a"
Apr 23 18:17:40.541968 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:17:40.541952 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a\": container with ID starting with 678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a not found: ID does not exist" containerID="678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a"
Apr 23 18:17:40.542008 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:40.541974 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a"} err="failed to get container status \"678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a\": rpc error: code = NotFound desc = could not find container \"678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a\": container with ID starting with 678aba6c348942b3f6bd5c6ab6cc5acdb1fc3c1bb080d393b826a61ce362b87a not found: ID does not exist"
Apr 23 18:17:42.491703 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:42.491671 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" path="/var/lib/kubelet/pods/57b28371-a97a-4cf1-83dd-3f576ed34e1f/volumes"
Apr 23 18:17:42.525710 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:42.525682 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr_77010218-e7eb-4d94-a9f0-466ca2b695f3/storage-initializer/0.log"
Apr 23 18:17:42.525882 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:42.525722 2571 generic.go:358] "Generic (PLEG): container finished" podID="77010218-e7eb-4d94-a9f0-466ca2b695f3" containerID="06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529" exitCode=1
Apr 23 18:17:42.525882 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:42.525798 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" event={"ID":"77010218-e7eb-4d94-a9f0-466ca2b695f3","Type":"ContainerDied","Data":"06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529"}
Apr 23 18:17:43.529849 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:43.529814 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr_77010218-e7eb-4d94-a9f0-466ca2b695f3/storage-initializer/0.log"
Apr 23 18:17:43.530219 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:43.529890 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" event={"ID":"77010218-e7eb-4d94-a9f0-466ca2b695f3","Type":"ContainerStarted","Data":"845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8"}
Apr 23 18:17:45.567958 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.567922 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"]
Apr 23 18:17:45.568371 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.568233 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" podUID="77010218-e7eb-4d94-a9f0-466ca2b695f3" containerName="storage-initializer" containerID="cri-o://845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8" gracePeriod=30
Apr 23 18:17:45.711879 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.711846 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"]
Apr 23 18:17:45.712122 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712109 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container"
Apr 23 18:17:45.712122 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712123 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container"
Apr 23 18:17:45.712198 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712136 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="storage-initializer"
Apr 23 18:17:45.712198 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712144 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="storage-initializer"
Apr 23 18:17:45.712198 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712154 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" containerName="storage-initializer"
Apr 23 18:17:45.712198 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712160 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" containerName="storage-initializer"
Apr 23 18:17:45.712198 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712168 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kube-rbac-proxy"
Apr 23 18:17:45.712198 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712173 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kube-rbac-proxy"
Apr 23 18:17:45.712198 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712180 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" containerName="storage-initializer"
Apr 23 18:17:45.712198 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712185 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" containerName="storage-initializer"
Apr 23 18:17:45.712419 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712220 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" containerName="storage-initializer"
Apr 23 18:17:45.712419 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712228 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kserve-container"
Apr 23 18:17:45.712419 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712236 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="57b28371-a97a-4cf1-83dd-3f576ed34e1f" containerName="kube-rbac-proxy"
Apr 23 18:17:45.712419 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.712310 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cacc9e94-0d77-40eb-94c3-2511206fe0fe" containerName="storage-initializer"
Apr 23 18:17:45.715174 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.715155 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.717306 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.717278 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-802ca-kube-rbac-proxy-sar-config\""
Apr 23 18:17:45.718638 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.718615 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rvdb2\""
Apr 23 18:17:45.718895 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.718881 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-802ca-predictor-serving-cert\""
Apr 23 18:17:45.728972 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.728947 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"]
Apr 23 18:17:45.826952 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.826856 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kserve-provision-location\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.826952 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.826912 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-proxy-tls\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.826952 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.826939 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hh97\" (UniqueName: \"kubernetes.io/projected/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kube-api-access-4hh97\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.827183 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.827007 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-802ca-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-raw-sklearn-802ca-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.927737 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.927694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-802ca-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-raw-sklearn-802ca-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.927737 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.927741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kserve-provision-location\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.927981 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.927882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-proxy-tls\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.927981 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.927926 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hh97\" (UniqueName: \"kubernetes.io/projected/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kube-api-access-4hh97\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.928166 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.928124 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kserve-provision-location\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.928442 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.928425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-802ca-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-raw-sklearn-802ca-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.930348 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.930325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-proxy-tls\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:45.936094 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:45.936068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hh97\" (UniqueName: \"kubernetes.io/projected/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kube-api-access-4hh97\") pod \"raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:46.025107 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:46.025061 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"
Apr 23 18:17:46.153852 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:46.153823 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"]
Apr 23 18:17:46.156116 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:17:46.156082 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5fd99c8_d7d6_4e45_8d27_33545cded6ee.slice/crio-c5ea8f8a7a00d21d8e6db22b570ca8d813624032c9d9cb6ec3266fc934d17f86 WatchSource:0}: Error finding container c5ea8f8a7a00d21d8e6db22b570ca8d813624032c9d9cb6ec3266fc934d17f86: Status 404 returned error can't find the container with id c5ea8f8a7a00d21d8e6db22b570ca8d813624032c9d9cb6ec3266fc934d17f86
Apr 23 18:17:46.540848 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:46.540811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" event={"ID":"b5fd99c8-d7d6-4e45-8d27-33545cded6ee","Type":"ContainerStarted","Data":"0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1"}
Apr 23 18:17:46.541020 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:46.540856 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" event={"ID":"b5fd99c8-d7d6-4e45-8d27-33545cded6ee","Type":"ContainerStarted","Data":"c5ea8f8a7a00d21d8e6db22b570ca8d813624032c9d9cb6ec3266fc934d17f86"}
Apr 23 18:17:48.103592 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.103570 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr_77010218-e7eb-4d94-a9f0-466ca2b695f3/storage-initializer/1.log"
Apr 23 18:17:48.103964 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.103949 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr_77010218-e7eb-4d94-a9f0-466ca2b695f3/storage-initializer/0.log"
Apr 23 18:17:48.104019 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.104013 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"
Apr 23 18:17:48.145278 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.145237 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\") pod \"77010218-e7eb-4d94-a9f0-466ca2b695f3\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") "
Apr 23 18:17:48.145278 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.145283 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77010218-e7eb-4d94-a9f0-466ca2b695f3-proxy-tls\") pod \"77010218-e7eb-4d94-a9f0-466ca2b695f3\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") "
Apr 23 18:17:48.145485 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.145340 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77010218-e7eb-4d94-a9f0-466ca2b695f3-kserve-provision-location\") pod \"77010218-e7eb-4d94-a9f0-466ca2b695f3\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") "
Apr 23 18:17:48.145485 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.145374 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s5t5\" (UniqueName: \"kubernetes.io/projected/77010218-e7eb-4d94-a9f0-466ca2b695f3-kube-api-access-8s5t5\") pod \"77010218-e7eb-4d94-a9f0-466ca2b695f3\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") "
Apr 23 18:17:48.145485 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.145409 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-cabundle-cert\") pod \"77010218-e7eb-4d94-a9f0-466ca2b695f3\" (UID: \"77010218-e7eb-4d94-a9f0-466ca2b695f3\") "
Apr 23 18:17:48.145673 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.145645 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77010218-e7eb-4d94-a9f0-466ca2b695f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "77010218-e7eb-4d94-a9f0-466ca2b695f3" (UID: "77010218-e7eb-4d94-a9f0-466ca2b695f3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:17:48.145806 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.145760 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config") pod "77010218-e7eb-4d94-a9f0-466ca2b695f3" (UID: "77010218-e7eb-4d94-a9f0-466ca2b695f3"). InnerVolumeSpecName "isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:17:48.145889 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.145823 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "77010218-e7eb-4d94-a9f0-466ca2b695f3" (UID: "77010218-e7eb-4d94-a9f0-466ca2b695f3"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:17:48.147677 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.147650 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77010218-e7eb-4d94-a9f0-466ca2b695f3-kube-api-access-8s5t5" (OuterVolumeSpecName: "kube-api-access-8s5t5") pod "77010218-e7eb-4d94-a9f0-466ca2b695f3" (UID: "77010218-e7eb-4d94-a9f0-466ca2b695f3"). InnerVolumeSpecName "kube-api-access-8s5t5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:17:48.147677 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.147670 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77010218-e7eb-4d94-a9f0-466ca2b695f3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "77010218-e7eb-4d94-a9f0-466ca2b695f3" (UID: "77010218-e7eb-4d94-a9f0-466ca2b695f3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:17:48.245969 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.245885 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8s5t5\" (UniqueName: \"kubernetes.io/projected/77010218-e7eb-4d94-a9f0-466ca2b695f3-kube-api-access-8s5t5\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:48.245969 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.245916 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-cabundle-cert\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:48.245969 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.245928 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/77010218-e7eb-4d94-a9f0-466ca2b695f3-isvc-init-fail-6c6f0e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:48.245969 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.245938 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77010218-e7eb-4d94-a9f0-466ca2b695f3-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\""
Apr 23 18:17:48.245969 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.245967 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName:
\"kubernetes.io/empty-dir/77010218-e7eb-4d94-a9f0-466ca2b695f3-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:17:48.547310 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.547281 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr_77010218-e7eb-4d94-a9f0-466ca2b695f3/storage-initializer/1.log" Apr 23 18:17:48.547674 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.547655 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr_77010218-e7eb-4d94-a9f0-466ca2b695f3/storage-initializer/0.log" Apr 23 18:17:48.547783 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.547695 2571 generic.go:358] "Generic (PLEG): container finished" podID="77010218-e7eb-4d94-a9f0-466ca2b695f3" containerID="845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8" exitCode=1 Apr 23 18:17:48.547837 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.547795 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" event={"ID":"77010218-e7eb-4d94-a9f0-466ca2b695f3","Type":"ContainerDied","Data":"845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8"} Apr 23 18:17:48.547837 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.547820 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" Apr 23 18:17:48.547942 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.547838 2571 scope.go:117] "RemoveContainer" containerID="845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8" Apr 23 18:17:48.547989 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.547828 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr" event={"ID":"77010218-e7eb-4d94-a9f0-466ca2b695f3","Type":"ContainerDied","Data":"26e00affa42386a0e8e593fdf6ad45ebb8c06d8b11c746dc2dd4977f15a50355"} Apr 23 18:17:48.555718 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.555702 2571 scope.go:117] "RemoveContainer" containerID="06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529" Apr 23 18:17:48.562579 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.562556 2571 scope.go:117] "RemoveContainer" containerID="845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8" Apr 23 18:17:48.562852 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:17:48.562830 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8\": container with ID starting with 845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8 not found: ID does not exist" containerID="845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8" Apr 23 18:17:48.562903 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.562868 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8"} err="failed to get container status \"845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8\": rpc error: code = NotFound desc = could not find container 
\"845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8\": container with ID starting with 845e6c8a40608e16bb85deaa2710343728011ac2eb4627a010ade01be2417ba8 not found: ID does not exist" Apr 23 18:17:48.562903 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.562889 2571 scope.go:117] "RemoveContainer" containerID="06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529" Apr 23 18:17:48.563126 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:17:48.563110 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529\": container with ID starting with 06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529 not found: ID does not exist" containerID="06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529" Apr 23 18:17:48.563164 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.563132 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529"} err="failed to get container status \"06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529\": rpc error: code = NotFound desc = could not find container \"06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529\": container with ID starting with 06732af94592ff3aba27fa7e3a5498ee5427daa097cffe7a48d3820e94865529 not found: ID does not exist" Apr 23 18:17:48.587300 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.587263 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"] Apr 23 18:17:48.591829 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:48.591801 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c6f0e-predictor-df7f7d88c-kzbpr"] Apr 23 18:17:50.492315 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:50.492230 2571 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77010218-e7eb-4d94-a9f0-466ca2b695f3" path="/var/lib/kubelet/pods/77010218-e7eb-4d94-a9f0-466ca2b695f3/volumes" Apr 23 18:17:50.557514 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:50.557481 2571 generic.go:358] "Generic (PLEG): container finished" podID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerID="0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1" exitCode=0 Apr 23 18:17:50.557642 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:50.557557 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" event={"ID":"b5fd99c8-d7d6-4e45-8d27-33545cded6ee","Type":"ContainerDied","Data":"0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1"} Apr 23 18:17:51.562459 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:51.562425 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" event={"ID":"b5fd99c8-d7d6-4e45-8d27-33545cded6ee","Type":"ContainerStarted","Data":"2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8"} Apr 23 18:17:51.562459 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:51.562465 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" event={"ID":"b5fd99c8-d7d6-4e45-8d27-33545cded6ee","Type":"ContainerStarted","Data":"892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635"} Apr 23 18:17:51.562961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:51.562798 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" Apr 23 18:17:51.562961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:51.562930 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" Apr 23 
18:17:51.564266 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:51.564241 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 18:17:51.580668 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:51.580616 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podStartSLOduration=6.5806032 podStartE2EDuration="6.5806032s" podCreationTimestamp="2026-04-23 18:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:17:51.580221766 +0000 UTC m=+2207.658793744" watchObservedRunningTime="2026-04-23 18:17:51.5806032 +0000 UTC m=+2207.659175177" Apr 23 18:17:52.565513 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:52.565473 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 18:17:57.570330 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:57.570302 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" Apr 23 18:17:57.570879 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:17:57.570856 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 18:18:07.571840 ip-10-0-138-17 
kubenswrapper[2571]: I0423 18:18:07.571795 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 18:18:17.571192 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:18:17.571152 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 18:18:27.571419 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:18:27.571326 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 18:18:37.570992 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:18:37.570946 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 18:18:47.570963 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:18:47.570926 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 18:18:57.571588 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:18:57.571559 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" Apr 23 18:19:05.832433 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.832401 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"] Apr 23 18:19:05.832893 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.832703 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" containerID="cri-o://892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635" gracePeriod=30 Apr 23 18:19:05.832893 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.832806 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kube-rbac-proxy" containerID="cri-o://2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8" gracePeriod=30 Apr 23 18:19:05.937557 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.937517 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g"] Apr 23 18:19:05.937833 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.937816 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77010218-e7eb-4d94-a9f0-466ca2b695f3" containerName="storage-initializer" Apr 23 18:19:05.937918 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.937835 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="77010218-e7eb-4d94-a9f0-466ca2b695f3" containerName="storage-initializer" Apr 23 18:19:05.937918 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.937849 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77010218-e7eb-4d94-a9f0-466ca2b695f3" containerName="storage-initializer" 
Apr 23 18:19:05.937918 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.937855 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="77010218-e7eb-4d94-a9f0-466ca2b695f3" containerName="storage-initializer" Apr 23 18:19:05.938026 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.937929 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="77010218-e7eb-4d94-a9f0-466ca2b695f3" containerName="storage-initializer" Apr 23 18:19:05.938026 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.938014 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="77010218-e7eb-4d94-a9f0-466ca2b695f3" containerName="storage-initializer" Apr 23 18:19:05.939981 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.939963 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:05.942120 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.942093 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-8a906-predictor-serving-cert\"" Apr 23 18:19:05.942251 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.942132 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\"" Apr 23 18:19:05.953080 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:05.953056 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g"] Apr 23 18:19:06.001239 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.001202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-proxy-tls\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.001239 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.001241 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.001469 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.001315 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kserve-provision-location\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.001469 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.001367 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgrrf\" (UniqueName: \"kubernetes.io/projected/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kube-api-access-pgrrf\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.102286 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.102170 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kserve-provision-location\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" 
(UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.102286 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.102229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgrrf\" (UniqueName: \"kubernetes.io/projected/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kube-api-access-pgrrf\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.102286 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.102274 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-proxy-tls\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.102574 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.102297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.102574 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:19:06.102421 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-serving-cert: secret "raw-sklearn-runtime-8a906-predictor-serving-cert" not found Apr 23 18:19:06.102574 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:19:06.102502 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-proxy-tls podName:6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5 nodeName:}" failed. No retries permitted until 2026-04-23 18:19:06.602487244 +0000 UTC m=+2282.681059200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-proxy-tls") pod "raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" (UID: "6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5") : secret "raw-sklearn-runtime-8a906-predictor-serving-cert" not found Apr 23 18:19:06.102717 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.102620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kserve-provision-location\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.102898 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.102880 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.111498 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.111476 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgrrf\" (UniqueName: \"kubernetes.io/projected/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kube-api-access-pgrrf\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.604295 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.604255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-proxy-tls\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.606875 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.606851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-proxy-tls\") pod \"raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:06.766570 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.766537 2571 generic.go:358] "Generic (PLEG): container finished" podID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerID="2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8" exitCode=2 Apr 23 18:19:06.766729 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.766610 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" event={"ID":"b5fd99c8-d7d6-4e45-8d27-33545cded6ee","Type":"ContainerDied","Data":"2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8"} Apr 23 18:19:06.850245 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:06.850205 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:07.000745 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:07.000708 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g"] Apr 23 18:19:07.004553 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:19:07.004524 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bcf381d_87b1_4c8e_ae3b_df1e9b35d6d5.slice/crio-3aef3f613ee02656ebcb18cd9c57c2284caad79779a4dfdbf4a8678e9da5fc34 WatchSource:0}: Error finding container 3aef3f613ee02656ebcb18cd9c57c2284caad79779a4dfdbf4a8678e9da5fc34: Status 404 returned error can't find the container with id 3aef3f613ee02656ebcb18cd9c57c2284caad79779a4dfdbf4a8678e9da5fc34 Apr 23 18:19:07.566369 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:07.566327 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.26:8643/healthz\": dial tcp 10.132.0.26:8643: connect: connection refused" Apr 23 18:19:07.571726 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:07.571691 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 18:19:07.770615 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:07.770577 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" 
event={"ID":"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5","Type":"ContainerStarted","Data":"8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f"} Apr 23 18:19:07.770615 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:07.770621 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" event={"ID":"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5","Type":"ContainerStarted","Data":"3aef3f613ee02656ebcb18cd9c57c2284caad79779a4dfdbf4a8678e9da5fc34"} Apr 23 18:19:10.466476 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.466450 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" Apr 23 18:19:10.531213 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.531171 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-802ca-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-raw-sklearn-802ca-kube-rbac-proxy-sar-config\") pod \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " Apr 23 18:19:10.531213 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.531212 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kserve-provision-location\") pod \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " Apr 23 18:19:10.531436 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.531232 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hh97\" (UniqueName: \"kubernetes.io/projected/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kube-api-access-4hh97\") pod \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " Apr 23 18:19:10.531436 
ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.531257 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-proxy-tls\") pod \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\" (UID: \"b5fd99c8-d7d6-4e45-8d27-33545cded6ee\") " Apr 23 18:19:10.531611 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.531580 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5fd99c8-d7d6-4e45-8d27-33545cded6ee" (UID: "b5fd99c8-d7d6-4e45-8d27-33545cded6ee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:19:10.531651 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.531603 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-raw-sklearn-802ca-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-802ca-kube-rbac-proxy-sar-config") pod "b5fd99c8-d7d6-4e45-8d27-33545cded6ee" (UID: "b5fd99c8-d7d6-4e45-8d27-33545cded6ee"). InnerVolumeSpecName "raw-sklearn-802ca-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:19:10.533464 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.533440 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b5fd99c8-d7d6-4e45-8d27-33545cded6ee" (UID: "b5fd99c8-d7d6-4e45-8d27-33545cded6ee"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:19:10.533582 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.533567 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kube-api-access-4hh97" (OuterVolumeSpecName: "kube-api-access-4hh97") pod "b5fd99c8-d7d6-4e45-8d27-33545cded6ee" (UID: "b5fd99c8-d7d6-4e45-8d27-33545cded6ee"). InnerVolumeSpecName "kube-api-access-4hh97". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:19:10.632670 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.632582 2571 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-802ca-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-raw-sklearn-802ca-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:19:10.632670 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.632615 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:19:10.632670 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.632626 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hh97\" (UniqueName: \"kubernetes.io/projected/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-kube-api-access-4hh97\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:19:10.632670 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.632638 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5fd99c8-d7d6-4e45-8d27-33545cded6ee-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:19:10.781409 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.781371 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerID="892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635" exitCode=0 Apr 23 18:19:10.781564 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.781455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" event={"ID":"b5fd99c8-d7d6-4e45-8d27-33545cded6ee","Type":"ContainerDied","Data":"892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635"} Apr 23 18:19:10.781564 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.781505 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" event={"ID":"b5fd99c8-d7d6-4e45-8d27-33545cded6ee","Type":"ContainerDied","Data":"c5ea8f8a7a00d21d8e6db22b570ca8d813624032c9d9cb6ec3266fc934d17f86"} Apr 23 18:19:10.781564 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.781531 2571 scope.go:117] "RemoveContainer" containerID="2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8" Apr 23 18:19:10.781564 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.781464 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp" Apr 23 18:19:10.789854 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.789833 2571 scope.go:117] "RemoveContainer" containerID="892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635" Apr 23 18:19:10.796794 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.796759 2571 scope.go:117] "RemoveContainer" containerID="0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1" Apr 23 18:19:10.802927 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.802895 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"] Apr 23 18:19:10.803989 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.803972 2571 scope.go:117] "RemoveContainer" containerID="2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8" Apr 23 18:19:10.804290 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:19:10.804267 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8\": container with ID starting with 2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8 not found: ID does not exist" containerID="2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8" Apr 23 18:19:10.804382 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.804298 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8"} err="failed to get container status \"2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8\": rpc error: code = NotFound desc = could not find container \"2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8\": container with ID starting with 2fccdf87aeada90cde7409535754b045f9d6aaf90dba943bc4ad1097a91f81d8 not found: ID does not exist" Apr 23 
18:19:10.804382 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.804320 2571 scope.go:117] "RemoveContainer" containerID="892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635" Apr 23 18:19:10.804575 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:19:10.804557 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635\": container with ID starting with 892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635 not found: ID does not exist" containerID="892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635" Apr 23 18:19:10.804645 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.804585 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635"} err="failed to get container status \"892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635\": rpc error: code = NotFound desc = could not find container \"892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635\": container with ID starting with 892054f17eefdf5514eb5d1104763c6c363dbeca7b5cf63b250e916b6471d635 not found: ID does not exist" Apr 23 18:19:10.804645 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.804609 2571 scope.go:117] "RemoveContainer" containerID="0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1" Apr 23 18:19:10.804955 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:19:10.804937 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1\": container with ID starting with 0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1 not found: ID does not exist" containerID="0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1" Apr 23 18:19:10.805012 
ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.804960 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1"} err="failed to get container status \"0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1\": rpc error: code = NotFound desc = could not find container \"0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1\": container with ID starting with 0aa487954ed686f59543ba02335ede85364fd1d7ac54ff53cd475b9652bfb8a1 not found: ID does not exist" Apr 23 18:19:10.806966 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:10.806944 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-802ca-predictor-6f89f97cd7-nxhdp"] Apr 23 18:19:11.786880 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:11.786841 2571 generic.go:358] "Generic (PLEG): container finished" podID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerID="8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f" exitCode=0 Apr 23 18:19:11.787259 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:11.786913 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" event={"ID":"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5","Type":"ContainerDied","Data":"8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f"} Apr 23 18:19:12.492676 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:12.492640 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" path="/var/lib/kubelet/pods/b5fd99c8-d7d6-4e45-8d27-33545cded6ee/volumes" Apr 23 18:19:12.791451 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:12.791418 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" 
event={"ID":"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5","Type":"ContainerStarted","Data":"dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f"} Apr 23 18:19:12.791818 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:12.791460 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" event={"ID":"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5","Type":"ContainerStarted","Data":"7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc"} Apr 23 18:19:12.791818 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:12.791796 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:12.791930 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:12.791913 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:12.793094 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:12.793070 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 23 18:19:12.812034 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:12.811992 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podStartSLOduration=7.81197837 podStartE2EDuration="7.81197837s" podCreationTimestamp="2026-04-23 18:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:19:12.810358187 +0000 UTC m=+2288.888930176" watchObservedRunningTime="2026-04-23 18:19:12.81197837 +0000 UTC m=+2288.890550348" Apr 23 
18:19:13.794848 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:13.794802 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 23 18:19:18.799677 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:18.799644 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:19:18.800276 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:18.800249 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 23 18:19:28.800671 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:28.800623 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 23 18:19:38.800522 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:38.800471 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 23 18:19:48.800360 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:48.800318 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" 
podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 23 18:19:58.800243 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:19:58.800204 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 23 18:20:08.800188 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:08.800142 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 23 18:20:18.801462 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:18.801433 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:20:26.021764 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:26.021729 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g"] Apr 23 18:20:26.022297 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:26.022060 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" containerID="cri-o://7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc" gracePeriod=30 Apr 23 18:20:26.022297 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:26.022093 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kube-rbac-proxy" containerID="cri-o://dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f" gracePeriod=30 Apr 23 18:20:26.997909 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:26.997873 2571 generic.go:358] "Generic (PLEG): container finished" podID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerID="dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f" exitCode=2 Apr 23 18:20:26.998072 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:26.997927 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" event={"ID":"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5","Type":"ContainerDied","Data":"dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f"} Apr 23 18:20:28.795842 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:28.795790 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.27:8643/healthz\": dial tcp 10.132.0.27:8643: connect: connection refused" Apr 23 18:20:28.801163 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:28.801130 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 23 18:20:30.457435 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.457412 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:20:30.617457 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.617363 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kserve-provision-location\") pod \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " Apr 23 18:20:30.617457 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.617409 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-proxy-tls\") pod \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " Apr 23 18:20:30.617457 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.617457 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\") pod \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " Apr 23 18:20:30.617744 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.617495 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgrrf\" (UniqueName: \"kubernetes.io/projected/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kube-api-access-pgrrf\") pod \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\" (UID: \"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5\") " Apr 23 18:20:30.617845 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.617743 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" (UID: "6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:20:30.617938 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.617912 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config") pod "6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" (UID: "6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5"). InnerVolumeSpecName "raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:20:30.619818 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.619763 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" (UID: "6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:20:30.619921 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.619814 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kube-api-access-pgrrf" (OuterVolumeSpecName: "kube-api-access-pgrrf") pod "6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" (UID: "6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5"). InnerVolumeSpecName "kube-api-access-pgrrf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:20:30.718500 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.718470 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgrrf\" (UniqueName: \"kubernetes.io/projected/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kube-api-access-pgrrf\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:20:30.718500 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.718497 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-kserve-provision-location\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:20:30.718686 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.718509 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-proxy-tls\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:20:30.718686 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:30.718521 2571 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5-raw-sklearn-runtime-8a906-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-17.ec2.internal\" DevicePath \"\"" Apr 23 18:20:31.009185 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.009153 2571 generic.go:358] "Generic (PLEG): container finished" podID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerID="7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc" exitCode=0 Apr 23 18:20:31.009350 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.009197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" 
event={"ID":"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5","Type":"ContainerDied","Data":"7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc"} Apr 23 18:20:31.009350 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.009219 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" event={"ID":"6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5","Type":"ContainerDied","Data":"3aef3f613ee02656ebcb18cd9c57c2284caad79779a4dfdbf4a8678e9da5fc34"} Apr 23 18:20:31.009350 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.009234 2571 scope.go:117] "RemoveContainer" containerID="dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f" Apr 23 18:20:31.009350 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.009234 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g" Apr 23 18:20:31.017445 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.017428 2571 scope.go:117] "RemoveContainer" containerID="7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc" Apr 23 18:20:31.024434 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.024413 2571 scope.go:117] "RemoveContainer" containerID="8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f" Apr 23 18:20:31.031474 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.031455 2571 scope.go:117] "RemoveContainer" containerID="dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f" Apr 23 18:20:31.031726 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:20:31.031709 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f\": container with ID starting with dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f not found: ID does not exist" 
containerID="dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f" Apr 23 18:20:31.031790 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.031734 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f"} err="failed to get container status \"dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f\": rpc error: code = NotFound desc = could not find container \"dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f\": container with ID starting with dde467462d90bcd83e9fa84c059960ae3f7105c41f2fa6790570f32d9485a55f not found: ID does not exist" Apr 23 18:20:31.031790 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.031752 2571 scope.go:117] "RemoveContainer" containerID="7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc" Apr 23 18:20:31.032008 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:20:31.031989 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc\": container with ID starting with 7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc not found: ID does not exist" containerID="7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc" Apr 23 18:20:31.032062 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.032014 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc"} err="failed to get container status \"7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc\": rpc error: code = NotFound desc = could not find container \"7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc\": container with ID starting with 7968a90dc32fb4f9bf156adda9967ad7dd7266a4fd697568e58425b64ffc4ccc not found: ID does not exist" Apr 23 
18:20:31.032062 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.032031 2571 scope.go:117] "RemoveContainer" containerID="8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f" Apr 23 18:20:31.032240 ip-10-0-138-17 kubenswrapper[2571]: E0423 18:20:31.032223 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f\": container with ID starting with 8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f not found: ID does not exist" containerID="8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f" Apr 23 18:20:31.032279 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.032247 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f"} err="failed to get container status \"8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f\": rpc error: code = NotFound desc = could not find container \"8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f\": container with ID starting with 8260ff64af06ccada2b5571a76089982736354f16c59bc3cd447677d3c3f8a2f not found: ID does not exist" Apr 23 18:20:31.035308 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.035287 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g"] Apr 23 18:20:31.038048 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:31.038028 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8a906-predictor-6ddcd769bf-fh92g"] Apr 23 18:20:32.491757 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:32.491721 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" path="/var/lib/kubelet/pods/6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5/volumes" Apr 23 
18:20:50.948466 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948427 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gtcqp/must-gather-ws4mc"]
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948706 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948720 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948734 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kube-rbac-proxy"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948739 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kube-rbac-proxy"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948746 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="storage-initializer"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948752 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="storage-initializer"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948763 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948796 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948802 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="storage-initializer"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948808 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="storage-initializer"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948816 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kube-rbac-proxy"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948821 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kube-rbac-proxy"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948861 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kserve-container"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948868 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bcf381d-87b1-4c8e-ae3b-df1e9b35d6d5" containerName="kube-rbac-proxy"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948874 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kserve-container"
Apr 23 18:20:50.948961 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.948882 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5fd99c8-d7d6-4e45-8d27-33545cded6ee" containerName="kube-rbac-proxy"
Apr 23 18:20:50.951498 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.951480 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtcqp/must-gather-ws4mc"
Apr 23 18:20:50.953452 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.953430 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gtcqp\"/\"kube-root-ca.crt\""
Apr 23 18:20:50.953705 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.953690 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gtcqp\"/\"default-dockercfg-qkf26\""
Apr 23 18:20:50.953818 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.953692 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gtcqp\"/\"openshift-service-ca.crt\""
Apr 23 18:20:50.959211 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:50.959185 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gtcqp/must-gather-ws4mc"]
Apr 23 18:20:51.066409 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:51.066380 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcj6w\" (UniqueName: \"kubernetes.io/projected/d93c8862-a352-498f-a900-132a9865c76b-kube-api-access-kcj6w\") pod \"must-gather-ws4mc\" (UID: \"d93c8862-a352-498f-a900-132a9865c76b\") " pod="openshift-must-gather-gtcqp/must-gather-ws4mc"
Apr 23 18:20:51.066579 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:51.066422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d93c8862-a352-498f-a900-132a9865c76b-must-gather-output\") pod \"must-gather-ws4mc\" (UID: \"d93c8862-a352-498f-a900-132a9865c76b\") " pod="openshift-must-gather-gtcqp/must-gather-ws4mc"
Apr 23 18:20:51.167417 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:51.167380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d93c8862-a352-498f-a900-132a9865c76b-must-gather-output\") pod \"must-gather-ws4mc\" (UID: \"d93c8862-a352-498f-a900-132a9865c76b\") " pod="openshift-must-gather-gtcqp/must-gather-ws4mc"
Apr 23 18:20:51.167570 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:51.167453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcj6w\" (UniqueName: \"kubernetes.io/projected/d93c8862-a352-498f-a900-132a9865c76b-kube-api-access-kcj6w\") pod \"must-gather-ws4mc\" (UID: \"d93c8862-a352-498f-a900-132a9865c76b\") " pod="openshift-must-gather-gtcqp/must-gather-ws4mc"
Apr 23 18:20:51.167737 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:51.167716 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d93c8862-a352-498f-a900-132a9865c76b-must-gather-output\") pod \"must-gather-ws4mc\" (UID: \"d93c8862-a352-498f-a900-132a9865c76b\") " pod="openshift-must-gather-gtcqp/must-gather-ws4mc"
Apr 23 18:20:51.177491 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:51.177461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcj6w\" (UniqueName: \"kubernetes.io/projected/d93c8862-a352-498f-a900-132a9865c76b-kube-api-access-kcj6w\") pod \"must-gather-ws4mc\" (UID: \"d93c8862-a352-498f-a900-132a9865c76b\") " pod="openshift-must-gather-gtcqp/must-gather-ws4mc"
Apr 23 18:20:51.260617 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:51.260586 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtcqp/must-gather-ws4mc"
Apr 23 18:20:51.386868 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:51.386819 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gtcqp/must-gather-ws4mc"]
Apr 23 18:20:51.391425 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:20:51.391391 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd93c8862_a352_498f_a900_132a9865c76b.slice/crio-ac2fe1bee2cc51aa2d30a7d1acd2dc1d1b5f9376d7b9e4768644263bc1e1f353 WatchSource:0}: Error finding container ac2fe1bee2cc51aa2d30a7d1acd2dc1d1b5f9376d7b9e4768644263bc1e1f353: Status 404 returned error can't find the container with id ac2fe1bee2cc51aa2d30a7d1acd2dc1d1b5f9376d7b9e4768644263bc1e1f353
Apr 23 18:20:52.063448 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:52.063405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtcqp/must-gather-ws4mc" event={"ID":"d93c8862-a352-498f-a900-132a9865c76b","Type":"ContainerStarted","Data":"ac2fe1bee2cc51aa2d30a7d1acd2dc1d1b5f9376d7b9e4768644263bc1e1f353"}
Apr 23 18:20:52.317126 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:52.317099 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:20:53.067940 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:53.067895 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtcqp/must-gather-ws4mc" event={"ID":"d93c8862-a352-498f-a900-132a9865c76b","Type":"ContainerStarted","Data":"c080437c6ade83784b62057f455bd8c9f3a189cda8c8861cb7b411a0cf6680c7"}
Apr 23 18:20:53.067940 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:53.067943 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtcqp/must-gather-ws4mc" event={"ID":"d93c8862-a352-498f-a900-132a9865c76b","Type":"ContainerStarted","Data":"e1395bec6fb9d5debb0d3edbfcb60ad206d308bbd3dd0a3a107c33966a7fdd30"}
Apr 23 18:20:53.084587 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:53.084523 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gtcqp/must-gather-ws4mc" podStartSLOduration=2.277926191 podStartE2EDuration="3.084500165s" podCreationTimestamp="2026-04-23 18:20:50 +0000 UTC" firstStartedPulling="2026-04-23 18:20:51.393170126 +0000 UTC m=+2387.471742081" lastFinishedPulling="2026-04-23 18:20:52.199744083 +0000 UTC m=+2388.278316055" observedRunningTime="2026-04-23 18:20:53.082711506 +0000 UTC m=+2389.161283512" watchObservedRunningTime="2026-04-23 18:20:53.084500165 +0000 UTC m=+2389.163072144"
Apr 23 18:20:53.815383 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:53.815349 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hqlck_e92c79cf-5cbc-4aab-b574-cecf28cb3b0f/global-pull-secret-syncer/0.log"
Apr 23 18:20:53.819509 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:53.819485 2571 ???:1] "http: TLS handshake error from 10.0.138.17:35372: EOF"
Apr 23 18:20:53.944086 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:53.944047 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-79jpg_929358d6-5b4a-49e4-a824-b3fbdf245db8/konnectivity-agent/0.log"
Apr 23 18:20:54.098325 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:54.098247 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-17.ec2.internal_40e393938119b34b55859b23c603acc7/haproxy/0.log"
Apr 23 18:20:58.022616 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:58.022582 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s9n68_37834963-67dc-4f7f-b7c1-31adea238b05/node-exporter/0.log"
Apr 23 18:20:58.047388 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:58.047353 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s9n68_37834963-67dc-4f7f-b7c1-31adea238b05/kube-rbac-proxy/0.log"
Apr 23 18:20:58.072238 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:20:58.072204 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s9n68_37834963-67dc-4f7f-b7c1-31adea238b05/init-textfile/0.log"
Apr 23 18:21:01.083271 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.083234 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"]
Apr 23 18:21:01.087039 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.087016 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.095078 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.095043 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"]
Apr 23 18:21:01.161146 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.161113 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-proc\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.161335 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.161152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-lib-modules\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.161335 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.161196 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzbt9\" (UniqueName: \"kubernetes.io/projected/cf9a6450-be95-4c6c-931d-20a2254f6a4d-kube-api-access-mzbt9\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.161335 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.161220 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-podres\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.161335 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.161239 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-sys\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.262386 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.262352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-proc\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.262386 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.262391 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-lib-modules\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.262614 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.262417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzbt9\" (UniqueName: \"kubernetes.io/projected/cf9a6450-be95-4c6c-931d-20a2254f6a4d-kube-api-access-mzbt9\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.262614 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.262492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-proc\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.262614 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.262538 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-podres\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.262614 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.262562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-sys\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.262614 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.262594 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-podres\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.262614 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.262535 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-lib-modules\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.262837 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.262630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf9a6450-be95-4c6c-931d-20a2254f6a4d-sys\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.272017 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.271949 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzbt9\" (UniqueName: \"kubernetes.io/projected/cf9a6450-be95-4c6c-931d-20a2254f6a4d-kube-api-access-mzbt9\") pod \"perf-node-gather-daemonset-w96ck\" (UID: \"cf9a6450-be95-4c6c-931d-20a2254f6a4d\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.400108 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.400004 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:01.534893 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:01.534852 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"]
Apr 23 18:21:01.542279 ip-10-0-138-17 kubenswrapper[2571]: W0423 18:21:01.542246 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcf9a6450_be95_4c6c_931d_20a2254f6a4d.slice/crio-c42cc4529dbe0f2c546c6c426341a5967885d7c9d2f4b0613258d8dffe769ae9 WatchSource:0}: Error finding container c42cc4529dbe0f2c546c6c426341a5967885d7c9d2f4b0613258d8dffe769ae9: Status 404 returned error can't find the container with id c42cc4529dbe0f2c546c6c426341a5967885d7c9d2f4b0613258d8dffe769ae9
Apr 23 18:21:02.100330 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:02.100283 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck" event={"ID":"cf9a6450-be95-4c6c-931d-20a2254f6a4d","Type":"ContainerStarted","Data":"ac2b7cf0b41c3d9379c48e669d8745692562e45b73a04d62d8e3686952152810"}
Apr 23 18:21:02.100788 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:02.100335 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck" event={"ID":"cf9a6450-be95-4c6c-931d-20a2254f6a4d","Type":"ContainerStarted","Data":"c42cc4529dbe0f2c546c6c426341a5967885d7c9d2f4b0613258d8dffe769ae9"}
Apr 23 18:21:02.100788 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:02.100374 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:02.117831 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:02.117752 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck" podStartSLOduration=1.117737722 podStartE2EDuration="1.117737722s" podCreationTimestamp="2026-04-23 18:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:21:02.117403655 +0000 UTC m=+2398.195975634" watchObservedRunningTime="2026-04-23 18:21:02.117737722 +0000 UTC m=+2398.196309699"
Apr 23 18:21:02.144288 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:02.144255 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j4f7r_245df2e7-7ac7-458b-a0e7-7f1121debc71/dns/0.log"
Apr 23 18:21:02.167394 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:02.167364 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j4f7r_245df2e7-7ac7-458b-a0e7-7f1121debc71/kube-rbac-proxy/0.log"
Apr 23 18:21:02.248546 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:02.248512 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cxr7m_33946126-5bdc-4047-b09e-8ca68acdbd65/dns-node-resolver/0.log"
Apr 23 18:21:02.815499 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:02.815466 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sdzf6_7acbf804-17f1-4ca5-8fe0-94cc15d6bf7f/node-ca/0.log"
Apr 23 18:21:03.975706 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:03.975674 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4txc2_9840772f-9282-45fa-a2c3-d4bfeab937c3/serve-healthcheck-canary/0.log"
Apr 23 18:21:04.430780 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:04.430743 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4jc8x_f58777e7-8e0a-42a0-ae61-087a5301da18/kube-rbac-proxy/0.log"
Apr 23 18:21:04.453393 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:04.453364 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4jc8x_f58777e7-8e0a-42a0-ae61-087a5301da18/exporter/0.log"
Apr 23 18:21:04.478369 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:04.478335 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4jc8x_f58777e7-8e0a-42a0-ae61-087a5301da18/extractor/0.log"
Apr 23 18:21:04.503001 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:04.502903 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log"
Apr 23 18:21:04.504801 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:04.504755 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log"
Apr 23 18:21:06.687707 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:06.687676 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-xzp9x_0573ddb1-4af0-4efa-8816-2ba7405c2617/manager/0.log"
Apr 23 18:21:06.857572 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:06.857537 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-m6rqt_44015263-b752-4776-bca1-9a63bf498d52/seaweedfs/0.log"
Apr 23 18:21:08.116685 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:08.116654 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-w96ck"
Apr 23 18:21:12.504523 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:12.504475 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4xrxl_ab714fa9-cc35-46b7-9076-b5623bd67831/kube-multus/0.log"
Apr 23 18:21:12.906024 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:12.905943 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t79h9_adddecdf-d1af-4726-baac-6b7ff1828f40/kube-multus-additional-cni-plugins/0.log"
Apr 23 18:21:12.928007 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:12.927977 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t79h9_adddecdf-d1af-4726-baac-6b7ff1828f40/egress-router-binary-copy/0.log"
Apr 23 18:21:12.952984 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:12.952958 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t79h9_adddecdf-d1af-4726-baac-6b7ff1828f40/cni-plugins/0.log"
Apr 23 18:21:12.977875 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:12.977837 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t79h9_adddecdf-d1af-4726-baac-6b7ff1828f40/bond-cni-plugin/0.log"
Apr 23 18:21:13.001833 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:13.001800 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t79h9_adddecdf-d1af-4726-baac-6b7ff1828f40/routeoverride-cni/0.log"
Apr 23 18:21:13.028853 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:13.028822 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t79h9_adddecdf-d1af-4726-baac-6b7ff1828f40/whereabouts-cni-bincopy/0.log"
Apr 23 18:21:13.051042 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:13.051011 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t79h9_adddecdf-d1af-4726-baac-6b7ff1828f40/whereabouts-cni/0.log"
Apr 23 18:21:13.232930 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:13.232848 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qzv7h_9d9e614d-f61d-4fcd-aaaf-ab97f54f2487/network-metrics-daemon/0.log"
Apr 23 18:21:13.253809 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:13.253761 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qzv7h_9d9e614d-f61d-4fcd-aaaf-ab97f54f2487/kube-rbac-proxy/0.log"
Apr 23 18:21:14.333854 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:14.333825 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-controller/0.log"
Apr 23 18:21:14.354574 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:14.354535 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/0.log"
Apr 23 18:21:14.365868 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:14.365839 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovn-acl-logging/1.log"
Apr 23 18:21:14.388626 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:14.388598 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/kube-rbac-proxy-node/0.log"
Apr 23 18:21:14.415251 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:14.415220 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 18:21:14.436459 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:14.436422 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/northd/0.log"
Apr 23 18:21:14.459044 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:14.459017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/nbdb/0.log"
Apr 23 18:21:14.486092 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:14.486062 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/sbdb/0.log"
Apr 23 18:21:14.616229 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:14.616149 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cqkbt_de5b0280-3d7b-48bb-b05d-befd13392325/ovnkube-controller/0.log"
Apr 23 18:21:16.211828 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:16.211797 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-8rrlh_9c2a48b6-a8f6-4813-bb00-957aa2486e5e/network-check-target-container/0.log"
Apr 23 18:21:17.400620 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:17.400593 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-spvtt_0f79f8f5-4fe2-436d-82d4-a94a0d58ba45/iptables-alerter/0.log"
Apr 23 18:21:18.141560 ip-10-0-138-17 kubenswrapper[2571]: I0423 18:21:18.141534 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7tx9v_4e3268bc-024b-451b-9f80-42d8dd401c8e/tuned/0.log"